Jan 05 13:49:09 crc systemd[1]: Starting Kubernetes Kubelet... Jan 05 13:49:09 crc restorecon[4701]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 05 13:49:09 
crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 05 13:49:09 crc restorecon[4701]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 05 
13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:09 crc restorecon[4701]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:09 crc 
restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 05 13:49:09 crc restorecon[4701]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 05 13:49:09 crc restorecon[4701]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:09 
crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 05 13:49:09 crc restorecon[4701]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 05 13:49:09 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 
crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 
13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 05 13:49:10 crc 
restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 
13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 
13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc 
restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 05 13:49:10 crc restorecon[4701]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 05 13:49:10 crc restorecon[4701]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 05 13:49:10 crc kubenswrapper[4740]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 05 13:49:10 crc kubenswrapper[4740]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 05 13:49:10 crc kubenswrapper[4740]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 05 13:49:10 crc kubenswrapper[4740]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 05 13:49:10 crc kubenswrapper[4740]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 05 13:49:10 crc kubenswrapper[4740]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.810579 4740 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812876 4740 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812901 4740 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812906 4740 feature_gate.go:330] unrecognized feature gate: Example Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812910 4740 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812913 4740 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812918 4740 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812921 4740 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812925 4740 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812930 4740 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812935 4740 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812939 4740 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812944 4740 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812949 4740 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812953 4740 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812957 4740 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812961 4740 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812964 4740 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812970 4740 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812975 4740 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812979 4740 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812983 4740 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812987 4740 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812991 4740 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812994 4740 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.812998 4740 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813002 4740 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813005 4740 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813009 4740 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813013 4740 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813018 4740 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813023 4740 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813027 4740 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813031 4740 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813036 4740 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813041 4740 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813045 4740 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813049 4740 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813054 4740 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813071 4740 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813076 4740 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813081 4740 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813085 4740 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813089 4740 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813094 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813098 4740 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813103 4740 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813107 4740 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813111 4740 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813115 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813119 4740 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813123 4740 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813127 4740 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813130 4740 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813134 4740 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813138 4740 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813142 4740 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
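The long runs of feature_gate.go:330 warnings appear to be OpenShift-level gate names that the kubelet's own feature-gate registry does not know, and the same set is re-logged several times during startup. When scanning a saved copy of this journal it can help to reduce them to a deduplicated count; a minimal sketch, assuming the journal has been exported to a plain-text file named kubelet.log (hypothetical path):

```python
# Sketch: tally the distinct "unrecognized feature gate" names in a saved journal dump.
import re
from collections import Counter

# Matches the kubelet's feature_gate.go:330 message format seen in this log.
pattern = re.compile(r"unrecognized feature gate: (\S+)")

with open("kubelet.log", encoding="utf-8") as fh:   # hypothetical saved journal dump
    gates = Counter(pattern.findall(fh.read()))

for name, count in gates.most_common():
    print(f"{count:4d}  {name}")
```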
Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813148 4740 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813152 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813156 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813161 4740 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813164 4740 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813168 4740 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813172 4740 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813176 4740 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813193 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813197 4740 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813202 4740 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813207 4740 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813211 4740 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813214 4740 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.813218 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813429 4740 flags.go:64] FLAG: --address="0.0.0.0" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813441 4740 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813447 4740 flags.go:64] FLAG: --anonymous-auth="true" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813453 4740 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813458 4740 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813463 4740 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813468 4740 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813474 4740 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813478 4740 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813482 4740 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813486 4740 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813491 4740 
flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813495 4740 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813499 4740 flags.go:64] FLAG: --cgroup-root="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813504 4740 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813508 4740 flags.go:64] FLAG: --client-ca-file="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813513 4740 flags.go:64] FLAG: --cloud-config="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813517 4740 flags.go:64] FLAG: --cloud-provider="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813521 4740 flags.go:64] FLAG: --cluster-dns="[]" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813532 4740 flags.go:64] FLAG: --cluster-domain="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813537 4740 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813541 4740 flags.go:64] FLAG: --config-dir="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813545 4740 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813549 4740 flags.go:64] FLAG: --container-log-max-files="5" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813555 4740 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813559 4740 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813563 4740 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813567 4740 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813571 4740 flags.go:64] FLAG: --contention-profiling="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813575 4740 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813579 4740 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813583 4740 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813588 4740 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813593 4740 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813598 4740 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813602 4740 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813606 4740 flags.go:64] FLAG: --enable-load-reader="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813610 4740 flags.go:64] FLAG: --enable-server="true" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813614 4740 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813619 4740 flags.go:64] FLAG: --event-burst="100" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813623 4740 flags.go:64] FLAG: --event-qps="50" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813627 4740 flags.go:64] FLAG: 
--event-storage-age-limit="default=0" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813631 4740 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813635 4740 flags.go:64] FLAG: --eviction-hard="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813640 4740 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813644 4740 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813648 4740 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813652 4740 flags.go:64] FLAG: --eviction-soft="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813656 4740 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813660 4740 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813665 4740 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813669 4740 flags.go:64] FLAG: --experimental-mounter-path="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813673 4740 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813677 4740 flags.go:64] FLAG: --fail-swap-on="true" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813681 4740 flags.go:64] FLAG: --feature-gates="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813687 4740 flags.go:64] FLAG: --file-check-frequency="20s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813691 4740 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813695 4740 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813699 4740 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813703 4740 flags.go:64] FLAG: --healthz-port="10248" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813707 4740 flags.go:64] FLAG: --help="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813712 4740 flags.go:64] FLAG: --hostname-override="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813716 4740 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813720 4740 flags.go:64] FLAG: --http-check-frequency="20s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813724 4740 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813729 4740 flags.go:64] FLAG: --image-credential-provider-config="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813733 4740 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813737 4740 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813742 4740 flags.go:64] FLAG: --image-service-endpoint="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813746 4740 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813750 4740 flags.go:64] FLAG: --kube-api-burst="100" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813755 4740 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813759 4740 flags.go:64] FLAG: --kube-api-qps="50" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813763 4740 flags.go:64] FLAG: --kube-reserved="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813767 4740 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813771 4740 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813776 4740 flags.go:64] FLAG: --kubelet-cgroups="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813780 4740 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813784 4740 flags.go:64] FLAG: --lock-file="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813788 4740 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813792 4740 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813796 4740 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813802 4740 flags.go:64] FLAG: --log-json-split-stream="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813806 4740 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813810 4740 flags.go:64] FLAG: --log-text-split-stream="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813814 4740 flags.go:64] FLAG: --logging-format="text" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813818 4740 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813822 4740 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813826 4740 flags.go:64] FLAG: --manifest-url="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813830 4740 flags.go:64] FLAG: --manifest-url-header="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813836 4740 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813840 4740 flags.go:64] FLAG: --max-open-files="1000000" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813845 4740 flags.go:64] FLAG: --max-pods="110" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813850 4740 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813854 4740 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813858 4740 flags.go:64] FLAG: --memory-manager-policy="None" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813862 4740 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813866 4740 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813871 4740 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813875 4740 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813884 4740 flags.go:64] FLAG: --node-status-max-images="50" 
Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813889 4740 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813894 4740 flags.go:64] FLAG: --oom-score-adj="-999" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813915 4740 flags.go:64] FLAG: --pod-cidr="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813921 4740 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813927 4740 flags.go:64] FLAG: --pod-manifest-path="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813931 4740 flags.go:64] FLAG: --pod-max-pids="-1" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813935 4740 flags.go:64] FLAG: --pods-per-core="0" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813939 4740 flags.go:64] FLAG: --port="10250" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813943 4740 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813947 4740 flags.go:64] FLAG: --provider-id="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813952 4740 flags.go:64] FLAG: --qos-reserved="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813955 4740 flags.go:64] FLAG: --read-only-port="10255" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813960 4740 flags.go:64] FLAG: --register-node="true" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813964 4740 flags.go:64] FLAG: --register-schedulable="true" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813968 4740 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813975 4740 flags.go:64] FLAG: --registry-burst="10" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813979 4740 flags.go:64] FLAG: --registry-qps="5" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813983 4740 flags.go:64] FLAG: --reserved-cpus="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813987 4740 flags.go:64] FLAG: --reserved-memory="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813992 4740 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.813997 4740 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814001 4740 flags.go:64] FLAG: --rotate-certificates="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814005 4740 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814009 4740 flags.go:64] FLAG: --runonce="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814013 4740 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814017 4740 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814021 4740 flags.go:64] FLAG: --seccomp-default="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814025 4740 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814030 4740 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814034 4740 flags.go:64] FLAG: 
--storage-driver-db="cadvisor" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814038 4740 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814042 4740 flags.go:64] FLAG: --storage-driver-password="root" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814046 4740 flags.go:64] FLAG: --storage-driver-secure="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814050 4740 flags.go:64] FLAG: --storage-driver-table="stats" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814054 4740 flags.go:64] FLAG: --storage-driver-user="root" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814071 4740 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814076 4740 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814080 4740 flags.go:64] FLAG: --system-cgroups="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814084 4740 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814091 4740 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814094 4740 flags.go:64] FLAG: --tls-cert-file="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814098 4740 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814103 4740 flags.go:64] FLAG: --tls-min-version="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814108 4740 flags.go:64] FLAG: --tls-private-key-file="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814111 4740 flags.go:64] FLAG: --topology-manager-policy="none" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814115 4740 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814119 4740 flags.go:64] FLAG: --topology-manager-scope="container" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814123 4740 flags.go:64] FLAG: --v="2" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814128 4740 flags.go:64] FLAG: --version="false" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814133 4740 flags.go:64] FLAG: --vmodule="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814138 4740 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814143 4740 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814242 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814246 4740 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814250 4740 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814254 4740 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814258 4740 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
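The flags.go:64 block is a one-line-per-flag dump of every command-line value the kubelet started with, which makes it convenient to diff a node's effective flags against expectations. A minimal parsing sketch, again assuming the journal text has been saved to kubelet.log (hypothetical path):

```python
# Sketch: pull the 'FLAG: --name="value"' pairs logged by flags.go:64 into a dict.
import re

flag_re = re.compile(r'FLAG: --([A-Za-z0-9-]+)="(.*?)"')

with open("kubelet.log", encoding="utf-8") as fh:   # hypothetical saved journal dump
    flags = dict(flag_re.findall(fh.read()))

print(flags.get("system-reserved"))  # cpu=200m,ephemeral-storage=350Mi,memory=350Mi above
print(flags.get("node-ip"))          # 192.168.126.11 above
```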
Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814262 4740 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814266 4740 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814270 4740 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814274 4740 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814277 4740 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814281 4740 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814284 4740 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814288 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814291 4740 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814295 4740 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814298 4740 feature_gate.go:330] unrecognized feature gate: Example Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814301 4740 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814305 4740 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814308 4740 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814312 4740 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814316 4740 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814319 4740 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814323 4740 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814327 4740 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814330 4740 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814333 4740 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814337 4740 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814340 4740 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814344 4740 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814347 4740 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814351 4740 feature_gate.go:330] 
unrecognized feature gate: SetEIPForNLBIngressController Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814355 4740 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814360 4740 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814365 4740 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814369 4740 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814372 4740 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814376 4740 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814379 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814383 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814386 4740 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814390 4740 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814393 4740 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814397 4740 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814401 4740 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814404 4740 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814408 4740 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814411 4740 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814415 4740 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814418 4740 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814422 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814425 4740 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814428 4740 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814432 4740 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814435 4740 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814439 4740 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814443 4740 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. 
It will be removed in a future release. Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814448 4740 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814452 4740 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814456 4740 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814460 4740 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814464 4740 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814467 4740 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814474 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814478 4740 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814482 4740 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814485 4740 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814488 4740 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814492 4740 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814495 4740 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814500 4740 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.814504 4740 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.814515 4740 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.822892 4740 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.822928 4740 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823048 4740 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823108 4740 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823120 4740 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823129 4740 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823137 4740 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823144 4740 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823152 4740 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823160 4740 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823168 4740 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823176 4740 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823183 4740 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823191 4740 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823199 4740 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823207 4740 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823216 4740 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823224 4740 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823232 4740 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823239 4740 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823247 4740 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 05 13:49:10 crc 
kubenswrapper[4740]: W0105 13:49:10.823255 4740 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823262 4740 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823273 4740 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823284 4740 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823292 4740 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823300 4740 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823307 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823315 4740 feature_gate.go:330] unrecognized feature gate: Example Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823322 4740 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823332 4740 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823344 4740 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823354 4740 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823362 4740 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823370 4740 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823379 4740 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823387 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823395 4740 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823403 4740 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823411 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823419 4740 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823428 4740 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823437 4740 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823445 4740 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823453 4740 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823460 4740 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 05 
13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823468 4740 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823476 4740 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823483 4740 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823491 4740 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823501 4740 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823510 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823520 4740 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823528 4740 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823536 4740 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823545 4740 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823553 4740 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823563 4740 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823572 4740 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823580 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823588 4740 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823596 4740 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823603 4740 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823611 4740 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823621 4740 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823629 4740 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823636 4740 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823645 4740 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823653 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823661 4740 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823668 4740 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823679 4740 
feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823687 4740 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.823702 4740 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823926 4740 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823938 4740 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823947 4740 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823954 4740 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823965 4740 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823974 4740 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823984 4740 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.823993 4740 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
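Each pass over the gate list ends with a feature_gate.go:386 summary of the effective map, printed in Go's fmt notation ({map[Name:true ... Name:false]}) rather than JSON. A small sketch for turning that summary line into a Python dict, under the same assumption of a saved kubelet.log:

```python
# Sketch: parse the Go-formatted 'feature gates: {map[Name:bool ...]}' summary into a dict.
import re

with open("kubelet.log", encoding="utf-8") as fh:   # hypothetical saved journal dump
    text = fh.read()

m = re.search(r"feature gates: \{map\[(.*?)\]\}", text)
gates = {}
if m:
    for pair in m.group(1).split():
        name, _, value = pair.partition(":")
        gates[name] = (value == "true")

print(gates.get("ValidatingAdmissionPolicy"))  # True in the dump above
print(gates.get("NodeSwap"))                   # False in the dump above
```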
Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824002 4740 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824014 4740 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824022 4740 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824030 4740 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824038 4740 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824046 4740 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824056 4740 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824088 4740 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824096 4740 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824104 4740 feature_gate.go:330] unrecognized feature gate: Example Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824113 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824121 4740 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824128 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824136 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824144 4740 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824154 4740 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824165 4740 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824173 4740 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824181 4740 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824189 4740 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824197 4740 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824206 4740 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824215 4740 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824223 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824230 4740 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824238 4740 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824246 4740 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824254 4740 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824261 4740 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824269 4740 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824276 4740 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824284 4740 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824291 4740 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824299 4740 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824307 4740 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824315 4740 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824322 4740 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824330 4740 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824337 4740 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824345 4740 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824352 4740 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824360 4740 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 
05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824369 4740 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824377 4740 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824384 4740 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824392 4740 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824399 4740 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824407 4740 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824415 4740 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824425 4740 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824433 4740 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824442 4740 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824451 4740 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824461 4740 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824469 4740 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824478 4740 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824486 4740 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824493 4740 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824502 4740 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824509 4740 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824517 4740 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824524 4740 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.824532 4740 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.824545 4740 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 
13:49:10.825226 4740 server.go:940] "Client rotation is on, will bootstrap in background" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.829384 4740 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.829511 4740 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.830332 4740 server.go:997] "Starting client certificate rotation" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.830387 4740 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.830710 4740 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-26 03:21:00.660783432 +0000 UTC Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.830848 4740 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.837189 4740 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 05 13:49:10 crc kubenswrapper[4740]: E0105 13:49:10.839778 4740 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.97:6443: connect: connection refused" logger="UnhandledError" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.842302 4740 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.850457 4740 log.go:25] "Validated CRI v1 runtime API" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.866920 4740 log.go:25] "Validated CRI v1 image API" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.868717 4740 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.871096 4740 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-05-13-44-26-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.871127 4740 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.887749 4740 manager.go:217] Machine: {Timestamp:2026-01-05 13:49:10.886019384 +0000 UTC m=+0.192927993 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] 
NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4c7bdb16-d3ee-45fb-8dfa-b661cf7f279e BootID:1108599f-421a-4bdb-91aa-32ca52cf5bab Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:d9:75:2d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:d9:75:2d Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:d0:95:e6 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:da:0f:fc Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:41:01:11 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:5d:2a:7c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:42:51:0e:bf:dd:cc Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:22:cc:17:39:e2:f8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data 
Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.888022 4740 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.888173 4740 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.888828 4740 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.889032 4740 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.889088 4740 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
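The cAdvisor Machine record above reports every capacity in raw bytes (MemoryCapacity, the vda disk, the filesystems behind /var and /boot). A small sketch, using only values quoted from that record, to restate them in GiB for quicker reading:

```python
# Byte values copied verbatim from the cAdvisor Machine record above.
capacities = {
    "MemoryCapacity":    33654124544,    # node RAM
    "vda disk":          214748364800,   # whole virtual disk
    "/dev/vda4 (/var)":  85292941312,    # xfs filesystem holding /var
    "/dev/vda3 (/boot)": 366869504,
}

GIB = 1024 ** 3
for name, size in capacities.items():
    print(f"{name:<20} {size:>15,} B  ~ {size / GIB:7.2f} GiB")
```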
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.889300 4740 topology_manager.go:138] "Creating topology manager with none policy" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.889312 4740 container_manager_linux.go:303] "Creating device plugin manager" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.889549 4740 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.889591 4740 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.889767 4740 state_mem.go:36] "Initialized new in-memory state store" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.889853 4740 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.891269 4740 kubelet.go:418] "Attempting to sync node with API server" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.891295 4740 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.891328 4740 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.891413 4740 kubelet.go:324] "Adding apiserver pod source" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.891427 4740 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.892762 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.97:6443: connect: connection refused Jan 05 13:49:10 crc kubenswrapper[4740]: E0105 13:49:10.892920 4740 
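The nodeConfig payload in the entry above is a single line of JSON, so the hard-eviction thresholds and reservations buried in it can be pulled out with the standard json module. A minimal sketch, assuming the payload has been saved verbatim to a file (the name nodeconfig.json here is hypothetical):

```python
import json

# Hypothetical file containing everything after "nodeConfig=" from the entry above.
with open("nodeconfig.json") as f:
    cfg = json.load(f)

print("cgroup driver:", cfg["CgroupDriver"], "| cgroup version:", cfg["CgroupVersion"])
print("system reserved:", cfg["SystemReserved"])

# Each threshold carries either a quantity ("100Mi") or a percentage (0.1 -> 10%).
for t in cfg["HardEvictionThresholds"]:
    value = t["Value"]
    limit = value["Quantity"] or f'{value["Percentage"]:.0%}'
    print(f'{t["Signal"]:<22} {t["Operator"]:<9} {limit}')
```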
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.97:6443: connect: connection refused" logger="UnhandledError" Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.892999 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.97:6443: connect: connection refused Jan 05 13:49:10 crc kubenswrapper[4740]: E0105 13:49:10.893615 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.97:6443: connect: connection refused" logger="UnhandledError" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.894970 4740 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.895550 4740 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.897205 4740 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.898226 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.898274 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.898291 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.898306 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.898329 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.898342 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.898355 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.898378 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.898395 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.898411 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.898437 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.898453 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.898997 4740 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.899858 
4740 server.go:1280] "Started kubelet" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.900380 4740 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.900397 4740 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.900732 4740 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.97:6443: connect: connection refused Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.901162 4740 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 05 13:49:10 crc systemd[1]: Started Kubernetes Kubelet. Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.903115 4740 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.903146 4740 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.903197 4740 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 17:08:36.047833277 +0000 UTC Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.903316 4740 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.903358 4740 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 05 13:49:10 crc kubenswrapper[4740]: E0105 13:49:10.903374 4740 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.903614 4740 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.904477 4740 factory.go:55] Registering systemd factory Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.904517 4740 factory.go:221] Registration of the systemd container factory successfully Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.904913 4740 factory.go:153] Registering CRI-O factory Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.904936 4740 factory.go:221] Registration of the crio container factory successfully Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.905042 4740 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.905136 4740 factory.go:103] Registering Raw factory Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.905170 4740 manager.go:1196] Started watching for new ooms in manager Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.905145 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.97:6443: connect: connection refused Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.905328 4740 server.go:460] "Adding debug handlers to kubelet server" Jan 05 13:49:10 crc kubenswrapper[4740]: E0105 13:49:10.905328 4740 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.97:6443: connect: connection refused" logger="UnhandledError" Jan 05 13:49:10 crc kubenswrapper[4740]: E0105 13:49:10.906114 4740 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.97:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1887d9e3bbebf07b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-05 13:49:10.899814523 +0000 UTC m=+0.206723132,LastTimestamp:2026-01-05 13:49:10.899814523 +0000 UTC m=+0.206723132,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 05 13:49:10 crc kubenswrapper[4740]: E0105 13:49:10.906768 4740 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" interval="200ms" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.906832 4740 manager.go:319] Starting recovery of all containers Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.922651 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.922753 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.922777 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.922799 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.922821 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.922840 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" 
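Every failed call so far (the certificate signing request, the Node, Service and CSIDriver reflectors, the CSINode lookup, the event write, the node lease) fails the same way, dial tcp 38.102.83.97:6443: connect: connection refused, which suggests the kubelet simply came up before anything was listening on api-int.crc.testing:6443. A throwaway sketch that polls for that endpoint to start accepting TCP connections (host and port taken from the log; the retry loop itself is only an illustration):

```python
import socket
import time

HOST, PORT = "api-int.crc.testing", 6443   # endpoint from the dial errors above

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True once a plain TCP connect to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError as exc:                 # refused, timed out, unresolved, ...
        print(f"{host}:{port} not ready: {exc}")
        return False

while not port_open(HOST, PORT):
    time.sleep(5)                          # poll until the API server answers
print(f"{HOST}:{PORT} is accepting connections")
```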
seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.922857 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.922875 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.922897 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.922914 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.922932 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.922950 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.922969 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923029 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923059 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923157 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923176 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 05 
13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923196 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923216 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923236 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923286 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923305 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923323 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923341 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923361 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923379 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923401 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923420 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 05 13:49:10 crc 
kubenswrapper[4740]: I0105 13:49:10.923439 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923491 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923509 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923529 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923547 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923565 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923592 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923610 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923629 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923647 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923665 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923683 4740 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923701 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923718 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923735 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923752 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923770 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923787 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923806 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923825 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923843 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923862 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923881 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923899 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923923 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923942 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.923998 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924019 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924038 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924056 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924127 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924146 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924165 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924184 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924203 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924225 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924243 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924261 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924279 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924298 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924314 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924334 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924352 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924370 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924388 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924407 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924426 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924444 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924461 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924480 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924497 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924517 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924536 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924555 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924574 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924593 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924610 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924628 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924647 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924665 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924685 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924705 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924723 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924741 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924760 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924778 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924796 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924816 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924834 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924854 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924874 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924891 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924909 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924927 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924945 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924964 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.924990 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925011 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925032 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925052 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925104 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925124 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925147 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925168 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925189 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925210 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925228 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925249 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925267 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925286 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925313 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925339 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925364 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925388 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925409 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925431 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925451 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925470 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925488 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925508 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925526 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925544 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925562 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925579 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925598 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925615 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925633 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925654 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925674 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925691 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925710 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925726 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925752 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925770 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925786 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925804 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925822 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925840 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925887 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925905 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925923 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925940 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925957 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925976 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.925992 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926009 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926027 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926045 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926061 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926107 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926125 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926147 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926165 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926182 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926201 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926219 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926236 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926254 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926271 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926292 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926310 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926331 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926351 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926371 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.926396 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927297 4740 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927336 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927362 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927381 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927400 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927418 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927438 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927467 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927486 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 05 13:49:10 crc 
kubenswrapper[4740]: I0105 13:49:10.927504 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927522 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927540 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927558 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927574 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927590 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927608 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927680 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927699 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927720 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927739 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 
13:49:10.927756 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927773 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927792 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927809 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927827 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927846 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927864 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927882 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927899 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927916 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927932 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927951 4740 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927968 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.927986 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.928006 4740 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.928024 4740 reconstruct.go:97] "Volume reconstruction finished" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.928036 4740 reconciler.go:26] "Reconciler: start to sync state" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.942555 4740 manager.go:324] Recovery completed Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.956914 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.958584 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.958725 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.958746 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.959814 4740 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.959841 4740 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.959862 4740 state_mem.go:36] "Initialized new in-memory state store" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.965105 4740 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.966823 4740 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.966910 4740 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.966969 4740 kubelet.go:2335] "Starting kubelet main sync loop" Jan 05 13:49:10 crc kubenswrapper[4740]: E0105 13:49:10.967044 4740 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 05 13:49:10 crc kubenswrapper[4740]: W0105 13:49:10.967927 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.97:6443: connect: connection refused Jan 05 13:49:10 crc kubenswrapper[4740]: E0105 13:49:10.967996 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.97:6443: connect: connection refused" logger="UnhandledError" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.971531 4740 policy_none.go:49] "None policy: Start" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.975929 4740 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 05 13:49:10 crc kubenswrapper[4740]: I0105 13:49:10.975974 4740 state_mem.go:35] "Initializing new in-memory state store" Jan 05 13:49:11 crc kubenswrapper[4740]: E0105 13:49:11.004422 4740 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.048971 4740 manager.go:334] "Starting Device Plugin manager" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.049043 4740 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.049059 4740 server.go:79] "Starting device plugin registration server" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.050022 4740 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.050115 4740 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.050678 4740 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.050816 4740 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.050858 4740 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 05 13:49:11 crc kubenswrapper[4740]: E0105 13:49:11.061556 4740 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.068131 4740 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 05 13:49:11 crc kubenswrapper[4740]: 
I0105 13:49:11.068289 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.073373 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.073434 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.073464 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.074055 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.074415 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.074522 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.076330 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.076364 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.076377 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.076508 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.076739 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.076808 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.077464 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.077502 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.077521 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.077675 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.077842 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.077894 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.078162 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.078291 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.078403 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.078451 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.078576 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.078593 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.078984 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.079018 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.079029 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.079167 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.079364 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.079487 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.079427 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.079689 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.079598 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.080148 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.080179 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.080195 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.080388 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.080430 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.081367 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.081397 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.081408 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.081709 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.081739 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.081751 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:11 crc kubenswrapper[4740]: E0105 13:49:11.108205 4740 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" interval="400ms" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.130776 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.130828 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.130862 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.130895 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.130924 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.131006 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.131095 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.131154 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.131242 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.131307 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.131356 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.131396 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.131427 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.131463 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.131492 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.150759 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.152369 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.152461 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.152481 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.152554 4740 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 05 13:49:11 crc kubenswrapper[4740]: E0105 13:49:11.153463 4740 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.97:6443: connect: connection refused" node="crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232244 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232314 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232359 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232396 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232440 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232474 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") 
" pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232535 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232559 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232504 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232606 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232627 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232640 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232685 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232728 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232775 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232627 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" 
(UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232781 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232825 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232817 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232893 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.232989 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.233012 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.233029 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.233154 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.233189 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.233220 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.233234 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.233154 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.233332 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.233368 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.354557 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.356774 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.356854 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.356874 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.356941 4740 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 05 13:49:11 crc kubenswrapper[4740]: E0105 13:49:11.357837 4740 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.97:6443: connect: connection refused" node="crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.416993 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.426368 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.445624 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: W0105 13:49:11.454817 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-2357102f8226e10f115622a5e3db8c68b5c7c673f74bb7d39571d42f7315e860 WatchSource:0}: Error finding container 2357102f8226e10f115622a5e3db8c68b5c7c673f74bb7d39571d42f7315e860: Status 404 returned error can't find the container with id 2357102f8226e10f115622a5e3db8c68b5c7c673f74bb7d39571d42f7315e860 Jan 05 13:49:11 crc kubenswrapper[4740]: W0105 13:49:11.458389 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-5da5414e0766b74a5c97f0534563b5169a17c8143c673657da9d1c3b087b8314 WatchSource:0}: Error finding container 5da5414e0766b74a5c97f0534563b5169a17c8143c673657da9d1c3b087b8314: Status 404 returned error can't find the container with id 5da5414e0766b74a5c97f0534563b5169a17c8143c673657da9d1c3b087b8314 Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.464879 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.471720 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 05 13:49:11 crc kubenswrapper[4740]: W0105 13:49:11.472585 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-1eece33bdc2dd5d5736cfc2084a069a5c78074dedb87afd0fdce0ae46b2fce6d WatchSource:0}: Error finding container 1eece33bdc2dd5d5736cfc2084a069a5c78074dedb87afd0fdce0ae46b2fce6d: Status 404 returned error can't find the container with id 1eece33bdc2dd5d5736cfc2084a069a5c78074dedb87afd0fdce0ae46b2fce6d Jan 05 13:49:11 crc kubenswrapper[4740]: W0105 13:49:11.493428 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-41b895f0c5677709fe4e84f42ddfd220ee74e8074145296258dd484ac8b7cef3 WatchSource:0}: Error finding container 41b895f0c5677709fe4e84f42ddfd220ee74e8074145296258dd484ac8b7cef3: Status 404 returned error can't find the container with id 41b895f0c5677709fe4e84f42ddfd220ee74e8074145296258dd484ac8b7cef3 Jan 05 13:49:11 crc kubenswrapper[4740]: E0105 13:49:11.509764 4740 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" interval="800ms" Jan 05 13:49:11 crc kubenswrapper[4740]: W0105 13:49:11.513522 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-fd57cb7f476f9afc1793ed16c00e95c7738b51391ba989f42c0584be4a327c78 WatchSource:0}: Error finding container fd57cb7f476f9afc1793ed16c00e95c7738b51391ba989f42c0584be4a327c78: Status 404 returned error can't find the container with id fd57cb7f476f9afc1793ed16c00e95c7738b51391ba989f42c0584be4a327c78 Jan 05 13:49:11 crc kubenswrapper[4740]: W0105 
13:49:11.743468 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.97:6443: connect: connection refused Jan 05 13:49:11 crc kubenswrapper[4740]: E0105 13:49:11.743600 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.97:6443: connect: connection refused" logger="UnhandledError" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.757983 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.762434 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.762483 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.762500 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.762546 4740 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 05 13:49:11 crc kubenswrapper[4740]: E0105 13:49:11.763103 4740 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.97:6443: connect: connection refused" node="crc" Jan 05 13:49:11 crc kubenswrapper[4740]: W0105 13:49:11.796632 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.97:6443: connect: connection refused Jan 05 13:49:11 crc kubenswrapper[4740]: E0105 13:49:11.796712 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.97:6443: connect: connection refused" logger="UnhandledError" Jan 05 13:49:11 crc kubenswrapper[4740]: W0105 13:49:11.840755 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.97:6443: connect: connection refused Jan 05 13:49:11 crc kubenswrapper[4740]: E0105 13:49:11.840850 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.97:6443: connect: connection refused" logger="UnhandledError" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.902011 4740 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.97:6443: connect: connection refused Jan 05 13:49:11 crc 
kubenswrapper[4740]: I0105 13:49:11.904098 4740 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 06:29:17.726977961 +0000 UTC Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.904137 4740 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 280h40m5.822842647s for next certificate rotation Jan 05 13:49:11 crc kubenswrapper[4740]: W0105 13:49:11.952310 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.97:6443: connect: connection refused Jan 05 13:49:11 crc kubenswrapper[4740]: E0105 13:49:11.952405 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.97:6443: connect: connection refused" logger="UnhandledError" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.973398 4740 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="07ddbb01afc2e8bc57154538291cd8fdd8cec262c2166175ec50ceb5970b56b5" exitCode=0 Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.973489 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"07ddbb01afc2e8bc57154538291cd8fdd8cec262c2166175ec50ceb5970b56b5"} Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.973586 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fd57cb7f476f9afc1793ed16c00e95c7738b51391ba989f42c0584be4a327c78"} Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.973724 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.975178 4740 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="54635c45ea8594ccf97c9eef7c9eeec37b954e2cce319a059783791b84d2e0df" exitCode=0 Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.975251 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"54635c45ea8594ccf97c9eef7c9eeec37b954e2cce319a059783791b84d2e0df"} Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.975280 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"41b895f0c5677709fe4e84f42ddfd220ee74e8074145296258dd484ac8b7cef3"} Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.975378 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.975578 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.975617 4740 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.975630 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.976191 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.976223 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.976235 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.978624 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"72eeec19f8ced0f5b2113145af58085f9ec9942ac6f7d1d106f740f869a8b70a"} Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.978695 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1eece33bdc2dd5d5736cfc2084a069a5c78074dedb87afd0fdce0ae46b2fce6d"} Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.982172 4740 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70" exitCode=0 Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.982264 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70"} Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.982301 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5da5414e0766b74a5c97f0534563b5169a17c8143c673657da9d1c3b087b8314"} Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.982400 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.984309 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.984343 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.984355 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.985215 4740 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ea9fbe3260edd1b2a52af69401f3b0670ccd546143990f95a6df3cb091070cfc" exitCode=0 Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.985242 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ea9fbe3260edd1b2a52af69401f3b0670ccd546143990f95a6df3cb091070cfc"} Jan 05 13:49:11 
crc kubenswrapper[4740]: I0105 13:49:11.985260 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2357102f8226e10f115622a5e3db8c68b5c7c673f74bb7d39571d42f7315e860"} Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.985343 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.986749 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.986773 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.986784 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.990673 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.991446 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.991474 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:11 crc kubenswrapper[4740]: I0105 13:49:11.991488 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:12 crc kubenswrapper[4740]: E0105 13:49:12.311078 4740 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" interval="1.6s" Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.564957 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.569943 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.569972 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.569996 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.570015 4740 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.863318 4740 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.989944 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2bf32d18ddfece021c1a298a371ab43509bc74664a5c0956bebd100ac38f9d26"} Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.990133 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.991352 4740 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.991394 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.991410 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.993493 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b3e4eda5b549b17b028a6ba934cda13239c39cfdfeddfc0a9e579fc49f27f430"} Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.993539 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bd1d03b6d91b1c16eb5c6ecfa017faa5c48a686ffb952d0f9653995a176e9c8d"} Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.993560 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"79dedd6ea0455cc2fabb5c0fe6242277846d58d5b40e7803a06a70fb8e2841f9"} Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.993688 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.994662 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.994701 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.994720 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.996648 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"329cfe2cbab91c25015d44f93cc2f5b2206d52516d93b5663bee77b515f177e6"} Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.996686 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a6df135b0d6006f316637021725c1ef47acb3e92aa21bee4b1ba63e8de46a8b6"} Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.996700 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5c00876b02c06dbadf1fee2cd2f7cf24796c1c98336c6e291e910e7a7ceb2ad0"} Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.996709 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.998122 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.998187 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 05 13:49:12 crc kubenswrapper[4740]: I0105 13:49:12.998215 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:13 crc kubenswrapper[4740]: I0105 13:49:13.001871 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ea6209a030996cf73f4372648fdbeddee16d680ca0366c40ba7e1a2a42b5ac5b"} Jan 05 13:49:13 crc kubenswrapper[4740]: I0105 13:49:13.001934 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8521c28bbe24bdc03c481b2c1c6e7fdbd7e0dff404d66600380c78da826a03af"} Jan 05 13:49:13 crc kubenswrapper[4740]: I0105 13:49:13.001967 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"73a92a17999d6c21802b190bb9c92d479fbb1bdb6bdf8296aa7261a24648e372"} Jan 05 13:49:13 crc kubenswrapper[4740]: I0105 13:49:13.001992 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cceba3a93ce2fe2e88dfbf0a42bde5732d11950d69f4551b08901b490ff34a98"} Jan 05 13:49:13 crc kubenswrapper[4740]: I0105 13:49:13.002018 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"af635a52f5998f29d57eb30e524277e0de1a44bd16186765a90c89a436d081c8"} Jan 05 13:49:13 crc kubenswrapper[4740]: I0105 13:49:13.002174 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:13 crc kubenswrapper[4740]: I0105 13:49:13.003281 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:13 crc kubenswrapper[4740]: I0105 13:49:13.003322 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:13 crc kubenswrapper[4740]: I0105 13:49:13.003341 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:13 crc kubenswrapper[4740]: I0105 13:49:13.004822 4740 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a2849762094f3f0e7e79b500e9a2e0afd694d1583a6956d360290666c9d1d6b7" exitCode=0 Jan 05 13:49:13 crc kubenswrapper[4740]: I0105 13:49:13.004875 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a2849762094f3f0e7e79b500e9a2e0afd694d1583a6956d360290666c9d1d6b7"} Jan 05 13:49:13 crc kubenswrapper[4740]: I0105 13:49:13.005100 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:13 crc kubenswrapper[4740]: I0105 13:49:13.006267 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:13 crc kubenswrapper[4740]: I0105 13:49:13.006313 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:13 crc 
kubenswrapper[4740]: I0105 13:49:13.006331 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:13 crc kubenswrapper[4740]: I0105 13:49:13.391500 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:49:14 crc kubenswrapper[4740]: I0105 13:49:14.010362 4740 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ecce64a244d02f764485b6fcdd8cd7d64a33c971a44f1dd5f7e00ef340573318" exitCode=0 Jan 05 13:49:14 crc kubenswrapper[4740]: I0105 13:49:14.010452 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ecce64a244d02f764485b6fcdd8cd7d64a33c971a44f1dd5f7e00ef340573318"} Jan 05 13:49:14 crc kubenswrapper[4740]: I0105 13:49:14.010545 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:14 crc kubenswrapper[4740]: I0105 13:49:14.010617 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:14 crc kubenswrapper[4740]: I0105 13:49:14.012104 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:14 crc kubenswrapper[4740]: I0105 13:49:14.012147 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:14 crc kubenswrapper[4740]: I0105 13:49:14.012163 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:14 crc kubenswrapper[4740]: I0105 13:49:14.012461 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:14 crc kubenswrapper[4740]: I0105 13:49:14.012517 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:14 crc kubenswrapper[4740]: I0105 13:49:14.012541 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:14 crc kubenswrapper[4740]: I0105 13:49:14.428916 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:14 crc kubenswrapper[4740]: I0105 13:49:14.429224 4740 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 05 13:49:14 crc kubenswrapper[4740]: I0105 13:49:14.429314 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:14 crc kubenswrapper[4740]: I0105 13:49:14.430677 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:14 crc kubenswrapper[4740]: I0105 13:49:14.430727 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:14 crc kubenswrapper[4740]: I0105 13:49:14.430745 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:15 crc kubenswrapper[4740]: I0105 13:49:15.018470 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0d54855b10661898d832966d972e1213cc38c467746218d9a182bc8610a562d2"} Jan 05 13:49:15 crc kubenswrapper[4740]: I0105 13:49:15.018505 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:15 crc kubenswrapper[4740]: I0105 13:49:15.018632 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"515cd3a62da384245ceaca693d8c1c295a3a442accada95516ac561b1d7e8ce7"} Jan 05 13:49:15 crc kubenswrapper[4740]: I0105 13:49:15.018648 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6a6e0e19705ac0dc3c01dc0f544187e567c2fbff31524b9cc955251b0e0fbf20"} Jan 05 13:49:15 crc kubenswrapper[4740]: I0105 13:49:15.018660 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"815fec4b43e255897b356c96189c1a7a0231516cc9ec0986f6b4bec802dd3b86"} Jan 05 13:49:15 crc kubenswrapper[4740]: I0105 13:49:15.019580 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:15 crc kubenswrapper[4740]: I0105 13:49:15.019642 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:15 crc kubenswrapper[4740]: I0105 13:49:15.019664 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:15 crc kubenswrapper[4740]: I0105 13:49:15.578201 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:49:15 crc kubenswrapper[4740]: I0105 13:49:15.744374 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:49:16 crc kubenswrapper[4740]: I0105 13:49:16.025952 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"131acc67e3c62003cb5333696870ce451725312c5b1477e19e94e0b7b1caa316"} Jan 05 13:49:16 crc kubenswrapper[4740]: I0105 13:49:16.025995 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:16 crc kubenswrapper[4740]: I0105 13:49:16.026198 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:16 crc kubenswrapper[4740]: I0105 13:49:16.026792 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:16 crc kubenswrapper[4740]: I0105 13:49:16.026818 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:16 crc kubenswrapper[4740]: I0105 13:49:16.026825 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:16 crc kubenswrapper[4740]: I0105 13:49:16.027636 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:16 crc kubenswrapper[4740]: I0105 13:49:16.027668 4740 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:16 crc kubenswrapper[4740]: I0105 13:49:16.027677 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:16 crc kubenswrapper[4740]: I0105 13:49:16.414439 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:16 crc kubenswrapper[4740]: I0105 13:49:16.414694 4740 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 05 13:49:16 crc kubenswrapper[4740]: I0105 13:49:16.414750 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:16 crc kubenswrapper[4740]: I0105 13:49:16.416225 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:16 crc kubenswrapper[4740]: I0105 13:49:16.416287 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:16 crc kubenswrapper[4740]: I0105 13:49:16.416304 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:17 crc kubenswrapper[4740]: I0105 13:49:17.029334 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:17 crc kubenswrapper[4740]: I0105 13:49:17.029378 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:17 crc kubenswrapper[4740]: I0105 13:49:17.031646 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:17 crc kubenswrapper[4740]: I0105 13:49:17.031698 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:17 crc kubenswrapper[4740]: I0105 13:49:17.031716 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:17 crc kubenswrapper[4740]: I0105 13:49:17.031878 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:17 crc kubenswrapper[4740]: I0105 13:49:17.031942 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:17 crc kubenswrapper[4740]: I0105 13:49:17.031964 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:17 crc kubenswrapper[4740]: I0105 13:49:17.806975 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:49:17 crc kubenswrapper[4740]: I0105 13:49:17.814197 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:49:17 crc kubenswrapper[4740]: I0105 13:49:17.879454 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 05 13:49:17 crc kubenswrapper[4740]: I0105 13:49:17.879648 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:17 crc kubenswrapper[4740]: I0105 13:49:17.880878 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:17 
crc kubenswrapper[4740]: I0105 13:49:17.880909 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:17 crc kubenswrapper[4740]: I0105 13:49:17.880920 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:18 crc kubenswrapper[4740]: I0105 13:49:18.031866 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:18 crc kubenswrapper[4740]: I0105 13:49:18.033090 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:18 crc kubenswrapper[4740]: I0105 13:49:18.033150 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:18 crc kubenswrapper[4740]: I0105 13:49:18.033185 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:18 crc kubenswrapper[4740]: I0105 13:49:18.744758 4740 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 13:49:18 crc kubenswrapper[4740]: I0105 13:49:18.744852 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 13:49:19 crc kubenswrapper[4740]: I0105 13:49:19.034859 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:19 crc kubenswrapper[4740]: I0105 13:49:19.036200 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:19 crc kubenswrapper[4740]: I0105 13:49:19.036258 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:19 crc kubenswrapper[4740]: I0105 13:49:19.036270 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:19 crc kubenswrapper[4740]: I0105 13:49:19.184729 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:19 crc kubenswrapper[4740]: I0105 13:49:19.184947 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:19 crc kubenswrapper[4740]: I0105 13:49:19.186825 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:19 crc kubenswrapper[4740]: I0105 13:49:19.186883 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:19 crc kubenswrapper[4740]: I0105 13:49:19.186904 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:19 crc kubenswrapper[4740]: I0105 13:49:19.929939 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 05 13:49:19 crc 
kubenswrapper[4740]: I0105 13:49:19.930197 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:19 crc kubenswrapper[4740]: I0105 13:49:19.931720 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:19 crc kubenswrapper[4740]: I0105 13:49:19.931805 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:19 crc kubenswrapper[4740]: I0105 13:49:19.931824 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:21 crc kubenswrapper[4740]: E0105 13:49:21.061669 4740 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 05 13:49:21 crc kubenswrapper[4740]: I0105 13:49:21.434873 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 05 13:49:21 crc kubenswrapper[4740]: I0105 13:49:21.435191 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:21 crc kubenswrapper[4740]: I0105 13:49:21.436609 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:21 crc kubenswrapper[4740]: I0105 13:49:21.436660 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:21 crc kubenswrapper[4740]: I0105 13:49:21.436677 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:22 crc kubenswrapper[4740]: E0105 13:49:22.571344 4740 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Jan 05 13:49:22 crc kubenswrapper[4740]: E0105 13:49:22.865137 4740 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 05 13:49:22 crc kubenswrapper[4740]: I0105 13:49:22.902406 4740 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 05 13:49:23 crc kubenswrapper[4740]: I0105 13:49:23.396642 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:49:23 crc kubenswrapper[4740]: I0105 13:49:23.397117 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:23 crc kubenswrapper[4740]: I0105 13:49:23.398151 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:23 crc kubenswrapper[4740]: I0105 13:49:23.398191 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:23 crc kubenswrapper[4740]: I0105 13:49:23.398206 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 
13:49:23 crc kubenswrapper[4740]: W0105 13:49:23.697208 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 05 13:49:23 crc kubenswrapper[4740]: I0105 13:49:23.697371 4740 trace.go:236] Trace[1799704314]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Jan-2026 13:49:13.696) (total time: 10001ms): Jan 05 13:49:23 crc kubenswrapper[4740]: Trace[1799704314]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:49:23.697) Jan 05 13:49:23 crc kubenswrapper[4740]: Trace[1799704314]: [10.0012113s] [10.0012113s] END Jan 05 13:49:23 crc kubenswrapper[4740]: E0105 13:49:23.697407 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 05 13:49:23 crc kubenswrapper[4740]: W0105 13:49:23.810878 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 05 13:49:23 crc kubenswrapper[4740]: I0105 13:49:23.811054 4740 trace.go:236] Trace[1648088139]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Jan-2026 13:49:13.808) (total time: 10002ms): Jan 05 13:49:23 crc kubenswrapper[4740]: Trace[1648088139]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (13:49:23.810) Jan 05 13:49:23 crc kubenswrapper[4740]: Trace[1648088139]: [10.002188725s] [10.002188725s] END Jan 05 13:49:23 crc kubenswrapper[4740]: E0105 13:49:23.811156 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 05 13:49:23 crc kubenswrapper[4740]: E0105 13:49:23.912855 4740 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Jan 05 13:49:24 crc kubenswrapper[4740]: W0105 13:49:24.145970 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 05 13:49:24 crc kubenswrapper[4740]: I0105 13:49:24.146158 4740 trace.go:236] Trace[982282433]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Jan-2026 13:49:14.143) (total time: 10002ms): Jan 05 13:49:24 crc kubenswrapper[4740]: Trace[982282433]: ---"Objects listed" error:Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (13:49:24.145) Jan 05 13:49:24 crc kubenswrapper[4740]: Trace[982282433]: [10.002338239s] [10.002338239s] END Jan 05 13:49:24 crc kubenswrapper[4740]: E0105 13:49:24.146191 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 05 13:49:24 crc kubenswrapper[4740]: W0105 13:49:24.157813 4740 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 05 13:49:24 crc kubenswrapper[4740]: I0105 13:49:24.157939 4740 trace.go:236] Trace[1931025649]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Jan-2026 13:49:14.156) (total time: 10001ms): Jan 05 13:49:24 crc kubenswrapper[4740]: Trace[1931025649]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (13:49:24.157) Jan 05 13:49:24 crc kubenswrapper[4740]: Trace[1931025649]: [10.001084817s] [10.001084817s] END Jan 05 13:49:24 crc kubenswrapper[4740]: E0105 13:49:24.157968 4740 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 05 13:49:24 crc kubenswrapper[4740]: I0105 13:49:24.172401 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:24 crc kubenswrapper[4740]: I0105 13:49:24.174617 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:24 crc kubenswrapper[4740]: I0105 13:49:24.174705 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:24 crc kubenswrapper[4740]: I0105 13:49:24.174732 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:24 crc kubenswrapper[4740]: I0105 13:49:24.174780 4740 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 05 13:49:24 crc kubenswrapper[4740]: I0105 13:49:24.430130 4740 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 13:49:24 crc kubenswrapper[4740]: I0105 13:49:24.430228 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 13:49:26 crc kubenswrapper[4740]: E0105 13:49:26.180938 4740 
event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.1887d9e3bbebf07b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-05 13:49:10.899814523 +0000 UTC m=+0.206723132,LastTimestamp:2026-01-05 13:49:10.899814523 +0000 UTC m=+0.206723132,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 05 13:49:27 crc kubenswrapper[4740]: I0105 13:49:27.015409 4740 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 05 13:49:28 crc kubenswrapper[4740]: I0105 13:49:28.054618 4740 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 05 13:49:28 crc kubenswrapper[4740]: I0105 13:49:28.054690 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 05 13:49:28 crc kubenswrapper[4740]: I0105 13:49:28.746205 4740 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 13:49:28 crc kubenswrapper[4740]: I0105 13:49:28.746264 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 13:49:29 crc kubenswrapper[4740]: I0105 13:49:29.437415 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:29 crc kubenswrapper[4740]: I0105 13:49:29.437674 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:29 crc kubenswrapper[4740]: I0105 13:49:29.439154 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:29 crc kubenswrapper[4740]: I0105 13:49:29.439255 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:29 crc kubenswrapper[4740]: I0105 13:49:29.439274 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:29 crc kubenswrapper[4740]: I0105 13:49:29.444519 4740 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:30 crc kubenswrapper[4740]: I0105 13:49:30.064418 4740 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 05 13:49:30 crc kubenswrapper[4740]: I0105 13:49:30.064494 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:30 crc kubenswrapper[4740]: I0105 13:49:30.065831 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:30 crc kubenswrapper[4740]: I0105 13:49:30.065892 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:30 crc kubenswrapper[4740]: I0105 13:49:30.065911 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:30 crc kubenswrapper[4740]: I0105 13:49:30.156052 4740 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 05 13:49:31 crc kubenswrapper[4740]: E0105 13:49:31.061865 4740 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 05 13:49:31 crc kubenswrapper[4740]: I0105 13:49:31.520576 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 05 13:49:31 crc kubenswrapper[4740]: I0105 13:49:31.520789 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:31 crc kubenswrapper[4740]: I0105 13:49:31.522131 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:31 crc kubenswrapper[4740]: I0105 13:49:31.522205 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:31 crc kubenswrapper[4740]: I0105 13:49:31.522225 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:31 crc kubenswrapper[4740]: I0105 13:49:31.539727 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 05 13:49:32 crc kubenswrapper[4740]: I0105 13:49:32.068765 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:32 crc kubenswrapper[4740]: I0105 13:49:32.069671 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:32 crc kubenswrapper[4740]: I0105 13:49:32.069716 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:32 crc kubenswrapper[4740]: I0105 13:49:32.069748 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.058470 4740 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.058807 4740 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 05 13:49:33 crc kubenswrapper[4740]: E0105 13:49:33.060332 4740 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 05 13:49:33 crc 
kubenswrapper[4740]: I0105 13:49:33.067556 4740 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.123679 4740 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:47470->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.123754 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:47470->192.168.126.11:17697: read: connection reset by peer" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.123788 4740 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:46042->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.123860 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:46042->192.168.126.11:17697: read: connection reset by peer" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.124147 4740 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.124211 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.576940 4740 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.905690 4740 apiserver.go:52] "Watching apiserver" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.909442 4740 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.910057 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.910504 4740 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.910517 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.910992 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:49:33 crc kubenswrapper[4740]: E0105 13:49:33.911039 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.911122 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.911097 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.911163 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:33 crc kubenswrapper[4740]: E0105 13:49:33.911090 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 05 13:49:33 crc kubenswrapper[4740]: E0105 13:49:33.911310 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.913735 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.914042 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.914196 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.914308 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.914749 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.915038 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.915336 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.915697 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.916576 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.954039 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.976496 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 05 13:49:33 crc kubenswrapper[4740]: I0105 13:49:33.989784 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.003857 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.004371 4740 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.018339 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.035046 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.051574 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.066455 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.066538 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.066592 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.066647 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.066699 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.066749 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.066799 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.066848 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.066900 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.066948 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.066995 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067043 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067132 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067188 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067242 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067298 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067352 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067406 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067455 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067506 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067560 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067614 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067662 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067712 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067760 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067816 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067865 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067915 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067968 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068024 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068111 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068166 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068217 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068268 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067241 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067238 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). 
InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067552 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068322 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068370 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068396 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068417 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068438 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068458 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068478 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068498 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068519 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068540 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068559 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068582 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068601 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068622 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068642 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068663 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068684 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068703 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068728 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068748 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068770 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068793 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068812 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068832 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068853 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068874 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068894 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068920 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068948 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068977 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069009 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069044 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069097 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069132 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069168 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069220 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069251 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069286 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 05 13:49:34 crc 
kubenswrapper[4740]: I0105 13:49:34.069316 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069345 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069374 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069401 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069432 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069462 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069493 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069521 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069553 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069583 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069613 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069645 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069676 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069707 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069738 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069770 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069798 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069820 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069840 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069879 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069910 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069939 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069970 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069999 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070031 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070083 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070121 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070151 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070221 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070254 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070286 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070318 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070349 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070377 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070411 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070445 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070477 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070507 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070545 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070576 4740 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070607 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070637 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070664 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070692 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070721 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070752 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070780 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070909 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070945 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070974 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071003 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071030 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071087 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071122 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071148 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071171 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071196 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071226 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071255 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071289 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071502 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071869 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071911 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071943 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071971 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072003 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072033 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072092 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072129 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072169 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072206 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072238 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072270 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072314 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072345 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072376 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072409 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072441 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072475 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072514 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072547 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072577 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072605 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072637 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072668 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072698 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072731 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072762 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072797 4740 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072830 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072861 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072891 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072917 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072940 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072961 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072987 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073010 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073041 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073093 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073125 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073158 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073223 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073247 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073269 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073293 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073316 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073339 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073364 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073393 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073424 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073459 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073495 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073530 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073563 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073596 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073630 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073664 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073695 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073728 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073761 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073795 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073824 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073857 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073912 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073945 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073972 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073999 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.074035 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.074091 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.074131 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.074175 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.074209 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.074242 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.074279 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.074314 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.074351 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.074386 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.074457 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.074545 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.074570 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068320 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067596 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.081569 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:34.58154684 +0000 UTC m=+23.888455429 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067618 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067748 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067931 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067962 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.067973 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068265 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068263 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068498 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.068653 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.081687 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069554 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069787 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.069908 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070007 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070359 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070373 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.070692 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071088 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071142 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071292 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071759 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.071977 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072317 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072343 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072346 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072687 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072729 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.072823 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073575 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073623 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.073824 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.074208 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.074526 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.074920 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.075121 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.075258 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.075371 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.075445 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.075656 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.075957 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.076007 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.076217 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.076249 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.076900 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.077118 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.082927 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.077112 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.077136 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.077346 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.077434 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.077218 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.077669 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.077774 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.077791 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.077924 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.078115 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.078519 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.078626 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.078733 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.079380 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.079467 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.079502 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.079748 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.079924 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.079934 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.080058 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.080270 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). 
InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.081047 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.081152 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.081502 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.081863 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.082271 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.082799 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.082967 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.082974 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.083264 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.083372 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.082915 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.083258 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.083854 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.084633 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.085876 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.086137 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.086407 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.086543 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.086558 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.086667 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.085984 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.086985 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.087438 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.087450 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.087578 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.087603 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.087838 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.087898 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.088012 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.088119 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.088215 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.090379 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.090567 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.090695 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.090749 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.090766 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.091030 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.091349 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.091402 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.091840 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.091912 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.092506 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.092709 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.092748 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.093594 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.093619 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.093026 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.093776 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.093949 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.093969 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.093208 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.093281 4740 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.094337 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.094390 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:34.594214435 +0000 UTC m=+23.901123124 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.093239 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.093543 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.094453 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.094454 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.094468 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.094602 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.094665 4740 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.094740 4740 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.094795 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.094847 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.094900 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:34.594882582 +0000 UTC m=+23.901791161 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.094963 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.095161 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.095162 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.093047 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.095355 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.095486 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.095697 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.095811 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.096383 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.097943 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.098437 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.098715 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.098892 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.098985 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.099610 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.100561 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.100653 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.100708 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.100766 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.101052 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.102061 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.114764 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.115280 4740 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ea6209a030996cf73f4372648fdbeddee16d680ca0366c40ba7e1a2a42b5ac5b" exitCode=255 Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.115359 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ea6209a030996cf73f4372648fdbeddee16d680ca0366c40ba7e1a2a42b5ac5b"} Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.115540 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.116157 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.116284 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.117346 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.117379 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.117414 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.117434 4740 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.117581 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:34.617560706 +0000 UTC m=+23.924469375 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.118825 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.120492 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.120529 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.120775 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.120797 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.121270 4740 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.121268 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.121330 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:34.621311503 +0000 UTC m=+23.928220082 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.121806 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.121898 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.121944 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.122001 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.122100 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.122148 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.122175 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.122856 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.123401 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.123825 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.124019 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.124053 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.124613 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.124713 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.124932 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.127940 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.128708 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.129017 4740 scope.go:117] "RemoveContainer" containerID="ea6209a030996cf73f4372648fdbeddee16d680ca0366c40ba7e1a2a42b5ac5b" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.129290 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.129302 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.129364 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.129440 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.129961 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.129998 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.130160 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.130181 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.130193 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.130307 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.130577 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.131197 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.131734 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.132168 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.132523 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.133001 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.133185 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.133389 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.133626 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.133759 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.134235 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.134375 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.134430 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.143122 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.155771 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.156484 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.164036 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.166049 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.167428 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.175698 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.175857 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.175922 4740 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.175937 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.175951 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.175969 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.175981 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.175992 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176003 4740 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176015 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176028 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.175998 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176039 4740 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176159 4740 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176172 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176185 4740 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176197 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176209 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176105 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176244 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176339 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176351 4740 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176362 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176373 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176400 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176412 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176422 4740 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176462 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176576 4740 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176645 4740 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176661 4740 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176674 4740 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176685 4740 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176906 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176920 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.176956 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177041 4740 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177055 4740 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177389 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177403 4740 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177415 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177426 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177436 4740 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177496 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177507 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177518 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177530 4740 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177587 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177602 4740 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177613 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" 
DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177623 4740 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177634 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177644 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177659 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177669 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177826 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177843 4740 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177853 4740 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177864 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177876 4740 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177889 4740 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177901 4740 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177911 4740 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 
13:49:34.177921 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177932 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177942 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177952 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.177962 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178031 4740 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178045 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178113 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178130 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178141 4740 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178153 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178164 4740 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178174 4740 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 
crc kubenswrapper[4740]: I0105 13:49:34.178184 4740 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178196 4740 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178206 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178262 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178275 4740 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178286 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178296 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178307 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178318 4740 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178369 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178382 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178395 4740 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178435 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc 
kubenswrapper[4740]: I0105 13:49:34.178447 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178456 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178536 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178549 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178612 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178624 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178635 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178645 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178656 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178703 4740 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178714 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178724 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178794 4740 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc 
kubenswrapper[4740]: I0105 13:49:34.178806 4740 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178817 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178827 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.178950 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179026 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179038 4740 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179101 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179113 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179124 4740 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179196 4740 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179210 4740 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179291 4740 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179851 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179867 4740 
reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179877 4740 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179885 4740 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179894 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179902 4740 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179933 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179942 4740 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179951 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179960 4740 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179968 4740 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.179977 4740 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180004 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180016 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 
13:49:34.180025 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180034 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180043 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180054 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180089 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180098 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180107 4740 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180117 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180126 4740 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180135 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180164 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180174 4740 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180182 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180191 4740 
reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180200 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180220 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180252 4740 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180262 4740 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180270 4740 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180280 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180287 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180297 4740 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180326 4740 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180335 4740 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180344 4740 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180352 4740 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 
13:49:34.180360 4740 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180369 4740 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180377 4740 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180449 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180461 4740 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180575 4740 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180584 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180592 4740 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180600 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180609 4740 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180619 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180649 4740 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180659 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 
crc kubenswrapper[4740]: I0105 13:49:34.180668 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180677 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180685 4740 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180694 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180702 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180730 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180738 4740 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180746 4740 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180754 4740 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180764 4740 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180771 4740 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180779 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180787 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 05 
13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180802 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180813 4740 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180823 4740 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180833 4740 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180842 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180850 4740 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180858 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180866 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180874 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180882 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180889 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180897 4740 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180905 4740 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 
13:49:34.180914 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.180924 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.181683 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.188560 4740 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.192571 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.235240 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 05 13:49:34 crc kubenswrapper[4740]: W0105 13:49:34.247600 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-37c56d1aeea74a6d13daf7ae0162597b3b621537a6cccbd386f615d402201cd9 WatchSource:0}: Error finding container 37c56d1aeea74a6d13daf7ae0162597b3b621537a6cccbd386f615d402201cd9: Status 404 returned error can't find the container with id 37c56d1aeea74a6d13daf7ae0162597b3b621537a6cccbd386f615d402201cd9 Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.255239 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 05 13:49:34 crc kubenswrapper[4740]: W0105 13:49:34.265652 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-db90e56a05dd72646182bcded8580d8397837a90d3bfbaa1e16420cecc042986 WatchSource:0}: Error finding container db90e56a05dd72646182bcded8580d8397837a90d3bfbaa1e16420cecc042986: Status 404 returned error can't find the container with id db90e56a05dd72646182bcded8580d8397837a90d3bfbaa1e16420cecc042986 Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.267675 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 05 13:49:34 crc kubenswrapper[4740]: W0105 13:49:34.292333 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-e09b00ec73fb67f7ca316fffd08aa654f1c7eadca19364e57d2276dc09527be5 WatchSource:0}: Error finding container e09b00ec73fb67f7ca316fffd08aa654f1c7eadca19364e57d2276dc09527be5: Status 404 returned error can't find the container with id e09b00ec73fb67f7ca316fffd08aa654f1c7eadca19364e57d2276dc09527be5 Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.584509 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.584665 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:35.584642308 +0000 UTC m=+24.891550887 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.685882 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.685938 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.685963 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.685997 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.686122 4740 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.686152 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.686169 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.686174 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.686221 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.686240 4740 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.686182 4740 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.686304 4740 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.686198 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:35.686178181 +0000 UTC m=+24.993086780 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.686464 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:35.686436497 +0000 UTC m=+24.993345116 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.686502 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:35.686487879 +0000 UTC m=+24.993396498 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:34 crc kubenswrapper[4740]: E0105 13:49:34.686530 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:35.68652005 +0000 UTC m=+24.993428669 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.972976 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.974134 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.976487 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.977869 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.979932 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.981021 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.982375 4740 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.984428 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.985736 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.987844 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.988987 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.991306 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.992405 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.993630 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.995629 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.996798 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.998867 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 05 13:49:34 crc kubenswrapper[4740]: I0105 13:49:34.999828 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.001374 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.002581 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.003569 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.004830 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.005773 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.007134 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.008044 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.009325 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.012296 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.012803 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.013464 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.013933 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.014395 4740 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.014495 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.015746 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.016221 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.016602 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.017833 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.018488 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.018993 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.019632 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.020320 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.020757 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.023946 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.024903 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.025483 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.026262 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.026750 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.027625 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.028432 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.029210 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.029667 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.030114 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.030907 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.031454 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.032347 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.118768 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"db90e56a05dd72646182bcded8580d8397837a90d3bfbaa1e16420cecc042986"} Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.120370 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"19da28657f256512f8552af41d50e693bd808c27f52ac30d2042eca2855360c6"} Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.120464 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"37c56d1aeea74a6d13daf7ae0162597b3b621537a6cccbd386f615d402201cd9"} Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.122262 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.124255 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"917ea24c6908e93ed0cfebe58cb70be97a7506c6f1e4fa4897c5338f719a2ae3"} Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.124369 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.126126 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6230472daa53872089f29bc47de307dfb2afdd3d583dac19d2d672b176339411"} Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.126473 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c52bafa38abd2a6708555d55da023aa5945477ed01ecdb5f6721b26267eeab21"} Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.126554 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e09b00ec73fb67f7ca316fffd08aa654f1c7eadca19364e57d2276dc09527be5"} Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.139596 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.175053 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff7aaeed-7c7e-43d0-bf51-9f92524eae1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af635a52f5998f29d57eb30e524277e0de1a44bd16186765a90c89a436d081c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73a92a17999d6c21802b190bb9c92d479fbb1bdb6bdf8296aa7261a24648e372\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceba3a93ce2fe2e88dfbf0a42bde5732d11950d69f4551b08901b490ff34a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6209a030996cf73f4372648fdbeddee16d680ca0366c40ba7e1a2a42b5ac5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea6209a030996cf73f4372648fdbeddee16d680ca0366c40ba7e1a2a42b5ac5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-05
T13:49:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0105 13:49:23.314249 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0105 13:49:23.315666 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1133732249/tls.crt::/tmp/serving-cert-1133732249/tls.key\\\\\\\"\\\\nI0105 13:49:33.094625 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0105 13:49:33.101402 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0105 13:49:33.101435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0105 13:49:33.101470 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0105 13:49:33.101481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0105 13:49:33.109430 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0105 13:49:33.109479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0105 13:49:33.109493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0105 13:49:33.109505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0105 13:49:33.109513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0105 13:49:33.109520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0105 13:49:33.109530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0105 13:49:33.109663 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0105 13:49:33.112896 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8521c28bbe24bdc03c481b2c1c6e7fdbd7e0dff404d66600380c78da826a03af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-05T13:49:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-05T13:49:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.194019 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.220228 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.236132 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.256211 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.280779 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19da28657f256512f8552af41d50e693bd808c27f52ac30d2042eca2855360c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.305801 4740 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19da28657f256512f8552af41d50e693bd808c27f52ac30d2042eca2855360c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.319642 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.338917 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6230472daa53872089f29bc47de307dfb2afdd3d583dac19d2d672b176339411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52bafa38abd2a6708555d55da023aa5945477ed01ecdb5f6721b26267eeab21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.355420 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff7aaeed-7c7e-43d0-bf51-9f92524eae1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af635a52f5998f29d57eb30e524277e0de1a44bd16186765a90c89a436d081c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73a92a17999d6c21802b190bb9c92d479fbb1bdb6bdf8296aa7261a24648e372\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceba3a93ce2fe2e88dfbf0a42bde5732d11950d69f4551b08901b490ff34a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://917ea24c6908e93ed0cfebe58cb70be97a7506c6f1e4fa4897c5338f719a2ae3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea6209a030996cf73f4372648fdbeddee16d680ca0366c40ba7e1a2a42b5ac5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0105 13:49:23.314249 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0105 13:49:23.315666 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1133732249/tls.crt::/tmp/serving-cert-1133732249/tls.key\\\\\\\"\\\\nI0105 13:49:33.094625 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0105 13:49:33.101402 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0105 13:49:33.101435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0105 13:49:33.101470 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0105 13:49:33.101481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0105 13:49:33.109430 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0105 13:49:33.109479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0105 13:49:33.109493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0105 13:49:33.109505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0105 13:49:33.109513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0105 13:49:33.109520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0105 13:49:33.109530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0105 13:49:33.109663 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0105 13:49:33.112896 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8521c28bbe24bdc03c481b2c1c6e7fdbd7e0dff404d66600380c78da826a03af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-05T13:49:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-05T13:49:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.367554 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.420452 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.442112 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.594020 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:35 crc kubenswrapper[4740]: E0105 13:49:35.594211 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:37.594183811 +0000 UTC m=+26.901092390 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.695615 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.695665 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.695691 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.695712 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:35 crc kubenswrapper[4740]: E0105 13:49:35.695810 4740 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 05 13:49:35 crc kubenswrapper[4740]: E0105 13:49:35.695864 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:37.695848858 +0000 UTC m=+27.002757447 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 05 13:49:35 crc kubenswrapper[4740]: E0105 13:49:35.695907 4740 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 05 13:49:35 crc kubenswrapper[4740]: E0105 13:49:35.695919 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 05 13:49:35 crc kubenswrapper[4740]: E0105 13:49:35.695970 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 05 13:49:35 crc kubenswrapper[4740]: E0105 13:49:35.695991 4740 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:35 crc kubenswrapper[4740]: E0105 13:49:35.695991 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 05 13:49:35 crc kubenswrapper[4740]: E0105 13:49:35.696013 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 05 13:49:35 crc kubenswrapper[4740]: E0105 13:49:35.696026 4740 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:35 crc kubenswrapper[4740]: E0105 13:49:35.695931 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:37.69592334 +0000 UTC m=+27.002831929 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 05 13:49:35 crc kubenswrapper[4740]: E0105 13:49:35.696151 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:37.696124335 +0000 UTC m=+27.003032954 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:35 crc kubenswrapper[4740]: E0105 13:49:35.696174 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:37.696162456 +0000 UTC m=+27.003071075 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.750490 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.756523 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.763286 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.773949 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19da28657f256512f8552af41d50e693bd808c27f52ac30d2042eca2855360c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.793758 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.817375 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6230472daa53872089f29bc47de307dfb2afdd3d583dac19d2d672b176339411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52bafa38abd2a6708555d55da023aa5945477ed01ecdb5f6721b26267eeab21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.844932 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff7aaeed-7c7e-43d0-bf51-9f92524eae1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af635a52f5998f29d57eb30e524277e0de1a44bd16186765a90c89a436d081c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73a92a17999d6c21802b190bb9c92d479fbb1bdb6bdf8296aa7261a24648e372\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-01-05T13:49:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceba3a93ce2fe2e88dfbf0a42bde5732d11950d69f4551b08901b490ff34a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://917ea24c6908e93ed0cfebe58cb70be97a7506c6f1e4fa4897c5338f719a2ae3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea6209a030996cf73f4372648fdbeddee16d680ca0366c40ba7e1a2a42b5ac5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0105 13:49:23.314249 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0105 13:49:23.315666 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1133732249/tls.crt::/tmp/serving-cert-1133732249/tls.key\\\\\\\"\\\\nI0105 13:49:33.094625 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0105 13:49:33.101402 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0105 13:49:33.101435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0105 13:49:33.101470 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0105 13:49:33.101481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0105 13:49:33.109430 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0105 13:49:33.109479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0105 13:49:33.109493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0105 13:49:33.109505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0105 13:49:33.109513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0105 13:49:33.109520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0105 13:49:33.109530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0105 
13:49:33.109663 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0105 13:49:33.112896 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8521c28bbe24bdc03c481b2c1c6e7fdbd7e0dff404d66600380c78da826a03af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-05T13:49:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-05T13:49:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.862658 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.883896 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.908698 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.927123 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e4f7030-4ffc-4270-a7bb-b6c187b71791\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c00876b02c06dbadf1fee2cd2f7cf24796c1c98336c6e291e910e7a7ceb2ad0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72eeec19f8ced0f5b2113145af58085f9ec9942ac6f7d1d106f740f869a8b70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-01-05T13:49:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6df135b0d6006f316637021725c1ef47acb3e92aa21bee4b1ba63e8de46a8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://329cfe2cbab91c25015d44f93cc2f5b2206d52516d93b5663bee77b515f177e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-05T13:49:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.949407 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19da28657f256512f8552af41d50e693bd808c27f52ac30d2042eca2855360c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.965173 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.967923 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.967962 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.967930 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:35 crc kubenswrapper[4740]: E0105 13:49:35.968054 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 05 13:49:35 crc kubenswrapper[4740]: E0105 13:49:35.968203 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 05 13:49:35 crc kubenswrapper[4740]: E0105 13:49:35.968333 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.977018 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6230472daa53872089f29bc47de307dfb2afdd3d583dac19d2d672b176339411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52bafa38abd2a6708555d55da023aa5945477ed01ecdb5f6721b26267eeab21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:35 crc kubenswrapper[4740]: I0105 13:49:35.994746 4740 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff7aaeed-7c7e-43d0-bf51-9f92524eae1a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af635a52f5998f29d57eb30e524277e0de1a44bd16186765a90c89a436d081c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73a92a17999d6c21802b190bb9c92d479fbb1bdb6bdf8296aa7261a24648e372\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cceba3a93ce2fe2e88dfbf0a42bde5732d11950d69f4551b08901b490ff34a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://917ea24c6908e93ed0cfebe58cb70be97a7506c6f1e4fa4897c5338f719a2ae3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea6209a030996cf73f4372648fdbeddee16d680ca0366c40ba7e1a2a42b5ac5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0105 13:49:23.314249 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0105 13:49:23.315666 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1133732249/tls.crt::/tmp/serving-cert-1133732249/tls.key\\\\\\\"\\\\nI0105 13:49:33.094625 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0105 13:49:33.101402 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0105 13:49:33.101435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0105 13:49:33.101470 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0105 13:49:33.101481 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0105 13:49:33.109430 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0105 13:49:33.109479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0105 13:49:33.109493 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0105 13:49:33.109505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0105 13:49:33.109513 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0105 13:49:33.109520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0105 13:49:33.109530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0105 13:49:33.109663 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0105 13:49:33.112896 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8521c28bbe24bdc03c481b2c1c6e7fdbd7e0dff404d66600380c78da826a03af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-05T13:49:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-05T13:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-05T13:49:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-05T13:49:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:35Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.008313 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:36Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.024455 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:36Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.035099 4740 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:36Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.260929 4740 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.263279 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.263343 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.263362 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.263449 4740 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.272883 4740 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.273234 4740 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.274573 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.274628 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.274646 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.274670 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.274687 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-05T13:49:36Z","lastTransitionTime":"2026-01-05T13:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 05 13:49:36 crc kubenswrapper[4740]: E0105 13:49:36.303988 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-05T13:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-05T13:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-05T13:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-05T13:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-05T13:49:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1108599f-421a-4bdb-91aa-32ca52cf5bab\\\",\\\"systemUUID\\\":\\\"4c7bdb16-d3ee-45fb-8dfa-b661cf7f279e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-05T13:49:36Z is after 2025-08-24T17:21:41Z" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.309821 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.309891 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.309913 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.309948 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.309973 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-05T13:49:36Z","lastTransitionTime":"2026-01-05T13:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.345968 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.346013 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.346025 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.346042 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 05 13:49:36 crc kubenswrapper[4740]: I0105 13:49:36.346055 4740 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-05T13:49:36Z","lastTransitionTime":"2026-01-05T13:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 05 13:49:37 crc kubenswrapper[4740]: I0105 13:49:37.613650 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:37 crc kubenswrapper[4740]: E0105 13:49:37.613943 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:41.613917825 +0000 UTC m=+30.920826444 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:37 crc kubenswrapper[4740]: I0105 13:49:37.715307 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:37 crc kubenswrapper[4740]: I0105 13:49:37.715398 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:37 crc kubenswrapper[4740]: I0105 13:49:37.715443 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:49:37 crc kubenswrapper[4740]: I0105 13:49:37.715511 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:49:37 crc kubenswrapper[4740]: E0105 13:49:37.715624 4740 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 05 13:49:37 crc kubenswrapper[4740]: E0105 13:49:37.715672 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 05 13:49:37 crc kubenswrapper[4740]: E0105 13:49:37.715715 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 05 13:49:37 crc kubenswrapper[4740]: E0105 13:49:37.715738 4740 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:37 crc kubenswrapper[4740]: E0105 13:49:37.715741 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 05 13:49:37 crc 
kubenswrapper[4740]: E0105 13:49:37.715778 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 05 13:49:37 crc kubenswrapper[4740]: E0105 13:49:37.715634 4740 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 05 13:49:37 crc kubenswrapper[4740]: E0105 13:49:37.715798 4740 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:37 crc kubenswrapper[4740]: E0105 13:49:37.715779 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:41.715745097 +0000 UTC m=+31.022653726 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 05 13:49:37 crc kubenswrapper[4740]: E0105 13:49:37.715884 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:41.71586193 +0000 UTC m=+31.022770549 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:37 crc kubenswrapper[4740]: E0105 13:49:37.715908 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:41.715895301 +0000 UTC m=+31.022803920 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 05 13:49:37 crc kubenswrapper[4740]: E0105 13:49:37.715927 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:41.715917311 +0000 UTC m=+31.022825930 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:37 crc kubenswrapper[4740]: I0105 13:49:37.967297 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:37 crc kubenswrapper[4740]: I0105 13:49:37.967369 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:49:37 crc kubenswrapper[4740]: E0105 13:49:37.967861 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 05 13:49:37 crc kubenswrapper[4740]: E0105 13:49:37.967728 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 05 13:49:37 crc kubenswrapper[4740]: I0105 13:49:37.967424 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:49:37 crc kubenswrapper[4740]: E0105 13:49:37.967938 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 05 13:49:38 crc kubenswrapper[4740]: I0105 13:49:38.135821 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5b98d57e0fc7cf8c47f193f72fd235c85281041cf4018bef1ab6937589e74c65"} Jan 05 13:49:38 crc kubenswrapper[4740]: I0105 13:49:38.178394 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=4.178375724 podStartE2EDuration="4.178375724s" podCreationTimestamp="2026-01-05 13:49:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:38.160664568 +0000 UTC m=+27.467573157" watchObservedRunningTime="2026-01-05 13:49:38.178375724 +0000 UTC m=+27.485284303" Jan 05 13:49:38 crc kubenswrapper[4740]: I0105 13:49:38.913703 4740 csr.go:261] certificate signing request csr-xsfvs is approved, waiting to be issued Jan 05 13:49:38 crc kubenswrapper[4740]: I0105 13:49:38.922782 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=3.922760453 podStartE2EDuration="3.922760453s" podCreationTimestamp="2026-01-05 13:49:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:38.282748 +0000 UTC m=+27.589656589" watchObservedRunningTime="2026-01-05 13:49:38.922760453 +0000 UTC m=+28.229669042" Jan 05 13:49:38 crc kubenswrapper[4740]: I0105 13:49:38.936580 4740 csr.go:257] certificate signing request csr-xsfvs is issued Jan 05 13:49:38 crc kubenswrapper[4740]: I0105 13:49:38.999663 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8wm6q"] Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.000148 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8wm6q" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.007545 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-zjlrn"] Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.007782 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zjlrn" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.010845 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.011458 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.011473 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.011823 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.015451 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.018133 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.019009 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.119320 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-97tfv"] Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.119662 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.122223 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.122240 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.122584 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.123000 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.126990 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ndsx\" (UniqueName: \"kubernetes.io/projected/22482f32-4d2c-44de-9f5e-fc78d450c1c6-kube-api-access-2ndsx\") pod \"node-resolver-8wm6q\" (UID: \"22482f32-4d2c-44de-9f5e-fc78d450c1c6\") " pod="openshift-dns/node-resolver-8wm6q" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.127023 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgn2d\" (UniqueName: \"kubernetes.io/projected/807511ce-a72a-43ff-980d-73a9822afd91-kube-api-access-pgn2d\") pod \"node-ca-zjlrn\" (UID: \"807511ce-a72a-43ff-980d-73a9822afd91\") " pod="openshift-image-registry/node-ca-zjlrn" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.127041 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/807511ce-a72a-43ff-980d-73a9822afd91-host\") pod \"node-ca-zjlrn\" (UID: \"807511ce-a72a-43ff-980d-73a9822afd91\") " pod="openshift-image-registry/node-ca-zjlrn" Jan 05 
13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.127112 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/22482f32-4d2c-44de-9f5e-fc78d450c1c6-hosts-file\") pod \"node-resolver-8wm6q\" (UID: \"22482f32-4d2c-44de-9f5e-fc78d450c1c6\") " pod="openshift-dns/node-resolver-8wm6q" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.127128 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/807511ce-a72a-43ff-980d-73a9822afd91-serviceca\") pod \"node-ca-zjlrn\" (UID: \"807511ce-a72a-43ff-980d-73a9822afd91\") " pod="openshift-image-registry/node-ca-zjlrn" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.129911 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-x67zt"] Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.130308 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.130663 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.131747 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.132088 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.149918 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-5qztt"] Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.150284 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:39 crc kubenswrapper[4740]: E0105 13:49:39.150338 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5qztt" podUID="b83a0bd7-6f44-4045-bb8c-e80f10959714" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.200282 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-xf724"] Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.200606 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.203689 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.204251 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.204978 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.205135 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.205263 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.212291 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h"] Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.212561 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.214310 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.214743 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.214760 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.214998 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.227678 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgn2d\" (UniqueName: \"kubernetes.io/projected/807511ce-a72a-43ff-980d-73a9822afd91-kube-api-access-pgn2d\") pod \"node-ca-zjlrn\" (UID: \"807511ce-a72a-43ff-980d-73a9822afd91\") " pod="openshift-image-registry/node-ca-zjlrn" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.227719 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-cnibin\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.227739 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-var-lib-cni-bin\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.227761 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-hostroot\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.227776 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/807511ce-a72a-43ff-980d-73a9822afd91-host\") pod \"node-ca-zjlrn\" (UID: \"807511ce-a72a-43ff-980d-73a9822afd91\") " pod="openshift-image-registry/node-ca-zjlrn" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.227802 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/22482f32-4d2c-44de-9f5e-fc78d450c1c6-hosts-file\") pod \"node-resolver-8wm6q\" (UID: \"22482f32-4d2c-44de-9f5e-fc78d450c1c6\") " pod="openshift-dns/node-resolver-8wm6q" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.227846 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hmgz\" (UniqueName: \"kubernetes.io/projected/11b442ff-cefe-4a62-bd99-da39c470692e-kube-api-access-7hmgz\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.227911 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/22482f32-4d2c-44de-9f5e-fc78d450c1c6-hosts-file\") pod \"node-resolver-8wm6q\" (UID: \"22482f32-4d2c-44de-9f5e-fc78d450c1c6\") " pod="openshift-dns/node-resolver-8wm6q" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.227917 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/807511ce-a72a-43ff-980d-73a9822afd91-host\") pod \"node-ca-zjlrn\" (UID: \"807511ce-a72a-43ff-980d-73a9822afd91\") " pod="openshift-image-registry/node-ca-zjlrn" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.227939 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbab02ea-f6b8-4598-9d8a-003ec5bed974-system-cni-dir\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.227997 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cbab02ea-f6b8-4598-9d8a-003ec5bed974-cnibin\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228034 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-etc-kubernetes\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228091 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/11b442ff-cefe-4a62-bd99-da39c470692e-multus-daemon-config\") pod \"multus-97tfv\" (UID: 
\"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228122 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cbab02ea-f6b8-4598-9d8a-003ec5bed974-cni-binary-copy\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228172 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ndsx\" (UniqueName: \"kubernetes.io/projected/22482f32-4d2c-44de-9f5e-fc78d450c1c6-kube-api-access-2ndsx\") pod \"node-resolver-8wm6q\" (UID: \"22482f32-4d2c-44de-9f5e-fc78d450c1c6\") " pod="openshift-dns/node-resolver-8wm6q" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228202 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-multus-socket-dir-parent\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228236 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-run-k8s-cni-cncf-io\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228270 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-multus-cni-dir\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228301 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cbab02ea-f6b8-4598-9d8a-003ec5bed974-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228339 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/807511ce-a72a-43ff-980d-73a9822afd91-serviceca\") pod \"node-ca-zjlrn\" (UID: \"807511ce-a72a-43ff-980d-73a9822afd91\") " pod="openshift-image-registry/node-ca-zjlrn" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228376 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-run-netns\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228409 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cbab02ea-f6b8-4598-9d8a-003ec5bed974-os-release\") pod 
\"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228438 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cbab02ea-f6b8-4598-9d8a-003ec5bed974-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228467 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-os-release\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228495 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-var-lib-cni-multus\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228525 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-var-lib-kubelet\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228587 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-system-cni-dir\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228627 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/11b442ff-cefe-4a62-bd99-da39c470692e-cni-binary-copy\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228657 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-multus-conf-dir\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228687 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-run-multus-certs\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.228718 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9gd6\" (UniqueName: 
\"kubernetes.io/projected/cbab02ea-f6b8-4598-9d8a-003ec5bed974-kube-api-access-w9gd6\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.229603 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/807511ce-a72a-43ff-980d-73a9822afd91-serviceca\") pod \"node-ca-zjlrn\" (UID: \"807511ce-a72a-43ff-980d-73a9822afd91\") " pod="openshift-image-registry/node-ca-zjlrn" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.248338 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgn2d\" (UniqueName: \"kubernetes.io/projected/807511ce-a72a-43ff-980d-73a9822afd91-kube-api-access-pgn2d\") pod \"node-ca-zjlrn\" (UID: \"807511ce-a72a-43ff-980d-73a9822afd91\") " pod="openshift-image-registry/node-ca-zjlrn" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.248565 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ndsx\" (UniqueName: \"kubernetes.io/projected/22482f32-4d2c-44de-9f5e-fc78d450c1c6-kube-api-access-2ndsx\") pod \"node-resolver-8wm6q\" (UID: \"22482f32-4d2c-44de-9f5e-fc78d450c1c6\") " pod="openshift-dns/node-resolver-8wm6q" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.310474 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-btftp"] Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.310512 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8wm6q" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.311729 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.314828 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.315386 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.315529 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.316265 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.317248 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.317324 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.317438 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.317475 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zjlrn" Jan 05 13:49:39 crc kubenswrapper[4740]: W0105 13:49:39.328732 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22482f32_4d2c_44de_9f5e_fc78d450c1c6.slice/crio-14e8c30a22661561d67cbcab59198627a02dbc0b629b68b138c194e6b2ad7386 WatchSource:0}: Error finding container 14e8c30a22661561d67cbcab59198627a02dbc0b629b68b138c194e6b2ad7386: Status 404 returned error can't find the container with id 14e8c30a22661561d67cbcab59198627a02dbc0b629b68b138c194e6b2ad7386 Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329437 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-run-netns\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329469 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cbab02ea-f6b8-4598-9d8a-003ec5bed974-os-release\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329486 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cbab02ea-f6b8-4598-9d8a-003ec5bed974-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329505 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-os-release\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329524 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5d2a6a5-1d31-4cb0-bd94-0828a592ff41-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tjf6h\" (UID: \"a5d2a6a5-1d31-4cb0-bd94-0828a592ff41\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329520 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-run-netns\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329541 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7737db78-0989-433f-968a-7e5b441b7537-rootfs\") pod \"machine-config-daemon-xf724\" (UID: \"7737db78-0989-433f-968a-7e5b441b7537\") " pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329611 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-system-cni-dir\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329672 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/11b442ff-cefe-4a62-bd99-da39c470692e-cni-binary-copy\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329703 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-var-lib-cni-multus\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329704 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cbab02ea-f6b8-4598-9d8a-003ec5bed974-os-release\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329724 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-system-cni-dir\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329744 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-var-lib-kubelet\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329756 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-os-release\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329718 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-var-lib-kubelet\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329785 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-var-lib-cni-multus\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329790 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-multus-conf-dir\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc 
kubenswrapper[4740]: I0105 13:49:39.329848 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-run-multus-certs\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329806 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-multus-conf-dir\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329865 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9gd6\" (UniqueName: \"kubernetes.io/projected/cbab02ea-f6b8-4598-9d8a-003ec5bed974-kube-api-access-w9gd6\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329939 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7737db78-0989-433f-968a-7e5b441b7537-mcd-auth-proxy-config\") pod \"machine-config-daemon-xf724\" (UID: \"7737db78-0989-433f-968a-7e5b441b7537\") " pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329949 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-run-multus-certs\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.329980 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-cnibin\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330000 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-var-lib-cni-bin\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330015 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-hostroot\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330034 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a5d2a6a5-1d31-4cb0-bd94-0828a592ff41-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tjf6h\" (UID: \"a5d2a6a5-1d31-4cb0-bd94-0828a592ff41\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330045 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-var-lib-cni-bin\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330049 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5d2a6a5-1d31-4cb0-bd94-0828a592ff41-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tjf6h\" (UID: \"a5d2a6a5-1d31-4cb0-bd94-0828a592ff41\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330094 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs\") pod \"network-metrics-daemon-5qztt\" (UID: \"b83a0bd7-6f44-4045-bb8c-e80f10959714\") " pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330097 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-cnibin\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330125 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-hostroot\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330157 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hmgz\" (UniqueName: \"kubernetes.io/projected/11b442ff-cefe-4a62-bd99-da39c470692e-kube-api-access-7hmgz\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330178 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbab02ea-f6b8-4598-9d8a-003ec5bed974-system-cni-dir\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330197 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6rxw\" (UniqueName: \"kubernetes.io/projected/7737db78-0989-433f-968a-7e5b441b7537-kube-api-access-x6rxw\") pod \"machine-config-daemon-xf724\" (UID: \"7737db78-0989-433f-968a-7e5b441b7537\") " pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330214 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cbab02ea-f6b8-4598-9d8a-003ec5bed974-cnibin\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330230 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7737db78-0989-433f-968a-7e5b441b7537-proxy-tls\") pod \"machine-config-daemon-xf724\" (UID: \"7737db78-0989-433f-968a-7e5b441b7537\") " pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330231 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbab02ea-f6b8-4598-9d8a-003ec5bed974-system-cni-dir\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330246 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-etc-kubernetes\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330259 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cbab02ea-f6b8-4598-9d8a-003ec5bed974-cnibin\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330262 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/11b442ff-cefe-4a62-bd99-da39c470692e-multus-daemon-config\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330306 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cbab02ea-f6b8-4598-9d8a-003ec5bed974-cni-binary-copy\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330317 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-etc-kubernetes\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330335 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-multus-socket-dir-parent\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330366 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-run-k8s-cni-cncf-io\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330386 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5d2a6a5-1d31-4cb0-bd94-0828a592ff41-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tjf6h\" (UID: \"a5d2a6a5-1d31-4cb0-bd94-0828a592ff41\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330402 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-host-run-k8s-cni-cncf-io\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330408 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a5d2a6a5-1d31-4cb0-bd94-0828a592ff41-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tjf6h\" (UID: \"a5d2a6a5-1d31-4cb0-bd94-0828a592ff41\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330431 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-multus-cni-dir\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330448 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cbab02ea-f6b8-4598-9d8a-003ec5bed974-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330412 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-multus-socket-dir-parent\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330465 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc9xg\" (UniqueName: \"kubernetes.io/projected/b83a0bd7-6f44-4045-bb8c-e80f10959714-kube-api-access-cc9xg\") pod \"network-metrics-daemon-5qztt\" (UID: \"b83a0bd7-6f44-4045-bb8c-e80f10959714\") " pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330454 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/11b442ff-cefe-4a62-bd99-da39c470692e-cni-binary-copy\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330570 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cbab02ea-f6b8-4598-9d8a-003ec5bed974-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 
13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330611 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/11b442ff-cefe-4a62-bd99-da39c470692e-multus-cni-dir\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330794 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/11b442ff-cefe-4a62-bd99-da39c470692e-multus-daemon-config\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330929 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cbab02ea-f6b8-4598-9d8a-003ec5bed974-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.330953 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cbab02ea-f6b8-4598-9d8a-003ec5bed974-cni-binary-copy\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: W0105 13:49:39.334579 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod807511ce_a72a_43ff_980d_73a9822afd91.slice/crio-27d80d9eb666abcfba3a97b13ca19c9464c1b6560be39d0f98a9c9ce31a73a4b WatchSource:0}: Error finding container 27d80d9eb666abcfba3a97b13ca19c9464c1b6560be39d0f98a9c9ce31a73a4b: Status 404 returned error can't find the container with id 27d80d9eb666abcfba3a97b13ca19c9464c1b6560be39d0f98a9c9ce31a73a4b Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.348919 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9gd6\" (UniqueName: \"kubernetes.io/projected/cbab02ea-f6b8-4598-9d8a-003ec5bed974-kube-api-access-w9gd6\") pod \"multus-additional-cni-plugins-x67zt\" (UID: \"cbab02ea-f6b8-4598-9d8a-003ec5bed974\") " pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.355228 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hmgz\" (UniqueName: \"kubernetes.io/projected/11b442ff-cefe-4a62-bd99-da39c470692e-kube-api-access-7hmgz\") pod \"multus-97tfv\" (UID: \"11b442ff-cefe-4a62-bd99-da39c470692e\") " pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.431582 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-97tfv" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.431871 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a5d2a6a5-1d31-4cb0-bd94-0828a592ff41-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tjf6h\" (UID: \"a5d2a6a5-1d31-4cb0-bd94-0828a592ff41\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.431917 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5d2a6a5-1d31-4cb0-bd94-0828a592ff41-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tjf6h\" (UID: \"a5d2a6a5-1d31-4cb0-bd94-0828a592ff41\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.431947 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-openvswitch\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.431969 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-ovn\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.431992 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs\") pod \"network-metrics-daemon-5qztt\" (UID: \"b83a0bd7-6f44-4045-bb8c-e80f10959714\") " pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432024 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-slash\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432051 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6rxw\" (UniqueName: \"kubernetes.io/projected/7737db78-0989-433f-968a-7e5b441b7537-kube-api-access-x6rxw\") pod \"machine-config-daemon-xf724\" (UID: \"7737db78-0989-433f-968a-7e5b441b7537\") " pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432109 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7737db78-0989-433f-968a-7e5b441b7537-proxy-tls\") pod \"machine-config-daemon-xf724\" (UID: \"7737db78-0989-433f-968a-7e5b441b7537\") " pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432131 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-log-socket\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432151 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-run-ovn-kubernetes\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432171 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a5d2a6a5-1d31-4cb0-bd94-0828a592ff41-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tjf6h\" (UID: \"a5d2a6a5-1d31-4cb0-bd94-0828a592ff41\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432173 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-run-netns\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432216 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-env-overrides\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432235 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovnkube-script-lib\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: E0105 13:49:39.432287 4740 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432323 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5d2a6a5-1d31-4cb0-bd94-0828a592ff41-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tjf6h\" (UID: \"a5d2a6a5-1d31-4cb0-bd94-0828a592ff41\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" Jan 05 13:49:39 crc kubenswrapper[4740]: E0105 13:49:39.432350 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs podName:b83a0bd7-6f44-4045-bb8c-e80f10959714 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:39.932329248 +0000 UTC m=+29.239237947 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs") pod "network-metrics-daemon-5qztt" (UID: "b83a0bd7-6f44-4045-bb8c-e80f10959714") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432373 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-cni-bin\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432400 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a5d2a6a5-1d31-4cb0-bd94-0828a592ff41-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tjf6h\" (UID: \"a5d2a6a5-1d31-4cb0-bd94-0828a592ff41\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432425 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovnkube-config\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432448 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc9xg\" (UniqueName: \"kubernetes.io/projected/b83a0bd7-6f44-4045-bb8c-e80f10959714-kube-api-access-cc9xg\") pod \"network-metrics-daemon-5qztt\" (UID: \"b83a0bd7-6f44-4045-bb8c-e80f10959714\") " pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432472 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovn-node-metrics-cert\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432493 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hltxk\" (UniqueName: \"kubernetes.io/projected/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-kube-api-access-hltxk\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432588 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-node-log\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432610 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-systemd\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432645 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5d2a6a5-1d31-4cb0-bd94-0828a592ff41-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tjf6h\" (UID: \"a5d2a6a5-1d31-4cb0-bd94-0828a592ff41\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432666 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7737db78-0989-433f-968a-7e5b441b7537-rootfs\") pod \"machine-config-daemon-xf724\" (UID: \"7737db78-0989-433f-968a-7e5b441b7537\") " pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432721 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432800 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7737db78-0989-433f-968a-7e5b441b7537-mcd-auth-proxy-config\") pod \"machine-config-daemon-xf724\" (UID: \"7737db78-0989-433f-968a-7e5b441b7537\") " pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432823 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-systemd-units\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432844 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-etc-openvswitch\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432866 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-cni-netd\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432899 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-kubelet\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.432920 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-var-lib-openvswitch\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.433287 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5d2a6a5-1d31-4cb0-bd94-0828a592ff41-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tjf6h\" (UID: \"a5d2a6a5-1d31-4cb0-bd94-0828a592ff41\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.433330 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7737db78-0989-433f-968a-7e5b441b7537-rootfs\") pod \"machine-config-daemon-xf724\" (UID: \"7737db78-0989-433f-968a-7e5b441b7537\") " pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.433443 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a5d2a6a5-1d31-4cb0-bd94-0828a592ff41-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tjf6h\" (UID: \"a5d2a6a5-1d31-4cb0-bd94-0828a592ff41\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.433788 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7737db78-0989-433f-968a-7e5b441b7537-mcd-auth-proxy-config\") pod \"machine-config-daemon-xf724\" (UID: \"7737db78-0989-433f-968a-7e5b441b7537\") " pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.436228 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7737db78-0989-433f-968a-7e5b441b7537-proxy-tls\") pod \"machine-config-daemon-xf724\" (UID: \"7737db78-0989-433f-968a-7e5b441b7537\") " pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.440649 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x67zt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.445037 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5d2a6a5-1d31-4cb0-bd94-0828a592ff41-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tjf6h\" (UID: \"a5d2a6a5-1d31-4cb0-bd94-0828a592ff41\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.448435 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5d2a6a5-1d31-4cb0-bd94-0828a592ff41-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tjf6h\" (UID: \"a5d2a6a5-1d31-4cb0-bd94-0828a592ff41\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.449150 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc9xg\" (UniqueName: \"kubernetes.io/projected/b83a0bd7-6f44-4045-bb8c-e80f10959714-kube-api-access-cc9xg\") pod \"network-metrics-daemon-5qztt\" (UID: \"b83a0bd7-6f44-4045-bb8c-e80f10959714\") " pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.449878 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6rxw\" (UniqueName: \"kubernetes.io/projected/7737db78-0989-433f-968a-7e5b441b7537-kube-api-access-x6rxw\") pod \"machine-config-daemon-xf724\" (UID: \"7737db78-0989-433f-968a-7e5b441b7537\") " pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.510186 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.526743 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533319 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovnkube-config\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533356 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovn-node-metrics-cert\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533375 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hltxk\" (UniqueName: \"kubernetes.io/projected/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-kube-api-access-hltxk\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533391 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-node-log\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533409 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-systemd\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533435 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533467 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-systemd-units\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533483 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-etc-openvswitch\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533500 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-cni-netd\") pod \"ovnkube-node-btftp\" (UID: 
\"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533520 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-kubelet\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533534 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-var-lib-openvswitch\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533549 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-openvswitch\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533568 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-ovn\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533594 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-slash\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533609 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-log-socket\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533622 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-run-ovn-kubernetes\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533636 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-run-netns\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533651 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-env-overrides\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 
13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533665 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovnkube-script-lib\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533686 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-cni-bin\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.533740 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-cni-bin\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.534339 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovnkube-config\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.534717 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-var-lib-openvswitch\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.534984 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-node-log\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.535018 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-systemd\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.535038 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.535071 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-systemd-units\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.535094 4740 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-etc-openvswitch\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.535115 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-cni-netd\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.535136 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-kubelet\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.535159 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-log-socket\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.535180 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-openvswitch\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.535203 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-ovn\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.535224 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-slash\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.535244 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-run-netns\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.535265 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-run-ovn-kubernetes\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.535559 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-env-overrides\") pod \"ovnkube-node-btftp\" (UID: 
\"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.535945 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovnkube-script-lib\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.538388 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovn-node-metrics-cert\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.564572 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hltxk\" (UniqueName: \"kubernetes.io/projected/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-kube-api-access-hltxk\") pod \"ovnkube-node-btftp\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: W0105 13:49:39.573933 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5d2a6a5_1d31_4cb0_bd94_0828a592ff41.slice/crio-a4dd6a0d2d0bf3c9b4eb9c25e5af7459196b330dd2c968b089a1293dcfc79252 WatchSource:0}: Error finding container a4dd6a0d2d0bf3c9b4eb9c25e5af7459196b330dd2c968b089a1293dcfc79252: Status 404 returned error can't find the container with id a4dd6a0d2d0bf3c9b4eb9c25e5af7459196b330dd2c968b089a1293dcfc79252 Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.662714 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:39 crc kubenswrapper[4740]: W0105 13:49:39.688933 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aefcfb9_fb79_4a82_a41c_01a94544f6f6.slice/crio-e73f42ee7b6083306d621aa9e56661b99414dd573ac7f19c7043368a4f336d89 WatchSource:0}: Error finding container e73f42ee7b6083306d621aa9e56661b99414dd573ac7f19c7043368a4f336d89: Status 404 returned error can't find the container with id e73f42ee7b6083306d621aa9e56661b99414dd573ac7f19c7043368a4f336d89 Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.692797 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls"] Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.693243 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.695885 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.700637 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.836269 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0f084127-cc41-470d-8670-ed75e3f33fde-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-smcls\" (UID: \"0f084127-cc41-470d-8670-ed75e3f33fde\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.836611 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0f084127-cc41-470d-8670-ed75e3f33fde-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-smcls\" (UID: \"0f084127-cc41-470d-8670-ed75e3f33fde\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.836677 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkv6s\" (UniqueName: \"kubernetes.io/projected/0f084127-cc41-470d-8670-ed75e3f33fde-kube-api-access-vkv6s\") pod \"ovnkube-control-plane-749d76644c-smcls\" (UID: \"0f084127-cc41-470d-8670-ed75e3f33fde\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.836706 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0f084127-cc41-470d-8670-ed75e3f33fde-env-overrides\") pod \"ovnkube-control-plane-749d76644c-smcls\" (UID: \"0f084127-cc41-470d-8670-ed75e3f33fde\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.937290 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkv6s\" (UniqueName: \"kubernetes.io/projected/0f084127-cc41-470d-8670-ed75e3f33fde-kube-api-access-vkv6s\") pod \"ovnkube-control-plane-749d76644c-smcls\" (UID: \"0f084127-cc41-470d-8670-ed75e3f33fde\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.937327 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0f084127-cc41-470d-8670-ed75e3f33fde-env-overrides\") pod \"ovnkube-control-plane-749d76644c-smcls\" (UID: \"0f084127-cc41-470d-8670-ed75e3f33fde\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.937346 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0f084127-cc41-470d-8670-ed75e3f33fde-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-smcls\" (UID: \"0f084127-cc41-470d-8670-ed75e3f33fde\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.937382 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0f084127-cc41-470d-8670-ed75e3f33fde-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-smcls\" (UID: \"0f084127-cc41-470d-8670-ed75e3f33fde\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.937400 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs\") pod \"network-metrics-daemon-5qztt\" (UID: \"b83a0bd7-6f44-4045-bb8c-e80f10959714\") " pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:39 crc kubenswrapper[4740]: E0105 13:49:39.937508 4740 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 05 13:49:39 crc kubenswrapper[4740]: E0105 13:49:39.937553 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs podName:b83a0bd7-6f44-4045-bb8c-e80f10959714 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:40.937539129 +0000 UTC m=+30.244447708 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs") pod "network-metrics-daemon-5qztt" (UID: "b83a0bd7-6f44-4045-bb8c-e80f10959714") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.937604 4740 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-05 13:44:38 +0000 UTC, rotation deadline is 2026-11-05 17:54:56.454969258 +0000 UTC Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.937642 4740 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7300h5m16.517329416s for next certificate rotation Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.938030 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0f084127-cc41-470d-8670-ed75e3f33fde-env-overrides\") pod \"ovnkube-control-plane-749d76644c-smcls\" (UID: \"0f084127-cc41-470d-8670-ed75e3f33fde\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.938201 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0f084127-cc41-470d-8670-ed75e3f33fde-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-smcls\" (UID: \"0f084127-cc41-470d-8670-ed75e3f33fde\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.940845 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0f084127-cc41-470d-8670-ed75e3f33fde-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-smcls\" (UID: \"0f084127-cc41-470d-8670-ed75e3f33fde\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 
13:49:39.955304 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkv6s\" (UniqueName: \"kubernetes.io/projected/0f084127-cc41-470d-8670-ed75e3f33fde-kube-api-access-vkv6s\") pod \"ovnkube-control-plane-749d76644c-smcls\" (UID: \"0f084127-cc41-470d-8670-ed75e3f33fde\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.967698 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.967734 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:49:39 crc kubenswrapper[4740]: I0105 13:49:39.967708 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:49:39 crc kubenswrapper[4740]: E0105 13:49:39.967825 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 05 13:49:39 crc kubenswrapper[4740]: E0105 13:49:39.967905 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 05 13:49:39 crc kubenswrapper[4740]: E0105 13:49:39.967997 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.019274 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.144327 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" event={"ID":"0f084127-cc41-470d-8670-ed75e3f33fde","Type":"ContainerStarted","Data":"d469e25928aa9dcd3d5e39cfae47ec76cd4152a05f3654e6007de946ba377bc1"} Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.145750 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-97tfv" event={"ID":"11b442ff-cefe-4a62-bd99-da39c470692e","Type":"ContainerStarted","Data":"aec9b04ea574c726c59aca5fe6dcbe0cb011dcb7e240cdea847450d2d54e7f4b"} Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.145802 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-97tfv" event={"ID":"11b442ff-cefe-4a62-bd99-da39c470692e","Type":"ContainerStarted","Data":"af9c7b06a9ddfa661e7d3bb9c0ba9f40f3e82de2b9c09c4104f2fc369c301dac"} Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.146793 4740 generic.go:334] "Generic (PLEG): container finished" podID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerID="9dea5e53734c6dadf190b9b09617cc1ed1995e4329c777c54169a843a75d292c" exitCode=0 Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.146853 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerDied","Data":"9dea5e53734c6dadf190b9b09617cc1ed1995e4329c777c54169a843a75d292c"} Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.146879 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerStarted","Data":"e73f42ee7b6083306d621aa9e56661b99414dd573ac7f19c7043368a4f336d89"} Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.148727 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8wm6q" event={"ID":"22482f32-4d2c-44de-9f5e-fc78d450c1c6","Type":"ContainerStarted","Data":"d149ebde993726d5ac3a268e7657380933692fa612f138174338c6a3f35be2b9"} Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.148768 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8wm6q" event={"ID":"22482f32-4d2c-44de-9f5e-fc78d450c1c6","Type":"ContainerStarted","Data":"14e8c30a22661561d67cbcab59198627a02dbc0b629b68b138c194e6b2ad7386"} Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.154418 4740 generic.go:334] "Generic (PLEG): container finished" podID="cbab02ea-f6b8-4598-9d8a-003ec5bed974" containerID="c1658578eb1eb937253ca81ff16ff2960bc0b90e93ac0d3cd5d0211b8077c67e" exitCode=0 Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.154479 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x67zt" event={"ID":"cbab02ea-f6b8-4598-9d8a-003ec5bed974","Type":"ContainerDied","Data":"c1658578eb1eb937253ca81ff16ff2960bc0b90e93ac0d3cd5d0211b8077c67e"} Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.154514 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x67zt" event={"ID":"cbab02ea-f6b8-4598-9d8a-003ec5bed974","Type":"ContainerStarted","Data":"015cd0de04f18c2ff9044037582014569f8a2ef6378e4251ac3467b45fc4b05d"} Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.155772 4740 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" event={"ID":"a5d2a6a5-1d31-4cb0-bd94-0828a592ff41","Type":"ContainerStarted","Data":"d56662304fdd9c40ffe0377528091f2188c4de60d9e6ff18d8f20e276b5a9566"} Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.155826 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" event={"ID":"a5d2a6a5-1d31-4cb0-bd94-0828a592ff41","Type":"ContainerStarted","Data":"a4dd6a0d2d0bf3c9b4eb9c25e5af7459196b330dd2c968b089a1293dcfc79252"} Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.157841 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"ca9877594a74c4a1d21f62bae1947ed783767c001517c888f96caf0245fe7e0a"} Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.157885 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"ce8190df163bf1923ad03250cabf835a3e8f9ecb64484dd6a124c97fc8435ba8"} Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.157898 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"b1f51cf268ae4d08b134e920282a2674210c63b7a1e165e7b025499155e4b3bb"} Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.161416 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zjlrn" event={"ID":"807511ce-a72a-43ff-980d-73a9822afd91","Type":"ContainerStarted","Data":"f3c82a47616cd5ecf7beb615d78f3d4b81ca6c93d7d6a03e6854e5d8f09abab2"} Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.161434 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zjlrn" event={"ID":"807511ce-a72a-43ff-980d-73a9822afd91","Type":"ContainerStarted","Data":"27d80d9eb666abcfba3a97b13ca19c9464c1b6560be39d0f98a9c9ce31a73a4b"} Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.190892 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-97tfv" podStartSLOduration=1.190871302 podStartE2EDuration="1.190871302s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:40.190729478 +0000 UTC m=+29.497638078" watchObservedRunningTime="2026-01-05 13:49:40.190871302 +0000 UTC m=+29.497779891" Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.253559 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tjf6h" podStartSLOduration=1.253534414 podStartE2EDuration="1.253534414s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:40.253334989 +0000 UTC m=+29.560243568" watchObservedRunningTime="2026-01-05 13:49:40.253534414 +0000 UTC m=+29.560442993" Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.277511 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/node-resolver-8wm6q" podStartSLOduration=2.2774955869999998 podStartE2EDuration="2.277495587s" podCreationTimestamp="2026-01-05 13:49:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:40.264351325 +0000 UTC m=+29.571259904" watchObservedRunningTime="2026-01-05 13:49:40.277495587 +0000 UTC m=+29.584404166" Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.277819 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podStartSLOduration=1.2778144949999999 podStartE2EDuration="1.277814495s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:40.276715577 +0000 UTC m=+29.583624166" watchObservedRunningTime="2026-01-05 13:49:40.277814495 +0000 UTC m=+29.584723074" Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.831511 4740 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.958941 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs\") pod \"network-metrics-daemon-5qztt\" (UID: \"b83a0bd7-6f44-4045-bb8c-e80f10959714\") " pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:40 crc kubenswrapper[4740]: E0105 13:49:40.959079 4740 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 05 13:49:40 crc kubenswrapper[4740]: E0105 13:49:40.959166 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs podName:b83a0bd7-6f44-4045-bb8c-e80f10959714 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:42.959142798 +0000 UTC m=+32.266051427 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs") pod "network-metrics-daemon-5qztt" (UID: "b83a0bd7-6f44-4045-bb8c-e80f10959714") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.967683 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:40 crc kubenswrapper[4740]: E0105 13:49:40.972791 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5qztt" podUID="b83a0bd7-6f44-4045-bb8c-e80f10959714" Jan 05 13:49:40 crc kubenswrapper[4740]: I0105 13:49:40.984661 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zjlrn" podStartSLOduration=2.984645231 podStartE2EDuration="2.984645231s" podCreationTimestamp="2026-01-05 13:49:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:40.31144686 +0000 UTC m=+29.618355439" watchObservedRunningTime="2026-01-05 13:49:40.984645231 +0000 UTC m=+30.291553800" Jan 05 13:49:41 crc kubenswrapper[4740]: I0105 13:49:41.169917 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" event={"ID":"0f084127-cc41-470d-8670-ed75e3f33fde","Type":"ContainerStarted","Data":"035993cda2861767fa1decdd763035b8133f3d95c75adb7fe96769c6dcee9581"} Jan 05 13:49:41 crc kubenswrapper[4740]: I0105 13:49:41.170273 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" event={"ID":"0f084127-cc41-470d-8670-ed75e3f33fde","Type":"ContainerStarted","Data":"b2e7eb40ffa26d7bdf24d06f73457ef09a534c85b95869a5cd164161b46b161e"} Jan 05 13:49:41 crc kubenswrapper[4740]: I0105 13:49:41.177573 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerStarted","Data":"fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490"} Jan 05 13:49:41 crc kubenswrapper[4740]: I0105 13:49:41.177636 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerStarted","Data":"3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3"} Jan 05 13:49:41 crc kubenswrapper[4740]: I0105 13:49:41.177658 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerStarted","Data":"861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4"} Jan 05 13:49:41 crc kubenswrapper[4740]: I0105 13:49:41.177682 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerStarted","Data":"67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4"} Jan 05 13:49:41 crc kubenswrapper[4740]: I0105 13:49:41.179228 4740 generic.go:334] "Generic (PLEG): container finished" podID="cbab02ea-f6b8-4598-9d8a-003ec5bed974" containerID="451a172fc8989328e7315d4f5becfece23a5dcce511be10636726c7979318c3c" exitCode=0 Jan 05 13:49:41 crc kubenswrapper[4740]: I0105 13:49:41.179283 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x67zt" event={"ID":"cbab02ea-f6b8-4598-9d8a-003ec5bed974","Type":"ContainerDied","Data":"451a172fc8989328e7315d4f5becfece23a5dcce511be10636726c7979318c3c"} Jan 05 13:49:41 crc kubenswrapper[4740]: I0105 13:49:41.184592 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-smcls" podStartSLOduration=2.184571844 podStartE2EDuration="2.184571844s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:41.183719882 +0000 UTC m=+30.490628471" watchObservedRunningTime="2026-01-05 13:49:41.184571844 +0000 UTC m=+30.491480423" Jan 05 13:49:41 crc kubenswrapper[4740]: I0105 13:49:41.665681 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:41 crc kubenswrapper[4740]: E0105 13:49:41.666043 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:49.665993974 +0000 UTC m=+38.972902593 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:41 crc kubenswrapper[4740]: I0105 13:49:41.766528 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:41 crc kubenswrapper[4740]: I0105 13:49:41.766589 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:49:41 crc kubenswrapper[4740]: I0105 13:49:41.766632 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:41 crc kubenswrapper[4740]: I0105 13:49:41.766666 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:49:41 crc kubenswrapper[4740]: E0105 13:49:41.766751 4740 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 05 13:49:41 crc kubenswrapper[4740]: E0105 
13:49:41.766802 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 05 13:49:41 crc kubenswrapper[4740]: E0105 13:49:41.766821 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 05 13:49:41 crc kubenswrapper[4740]: E0105 13:49:41.766835 4740 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:41 crc kubenswrapper[4740]: E0105 13:49:41.766833 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 05 13:49:41 crc kubenswrapper[4740]: E0105 13:49:41.766882 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 05 13:49:41 crc kubenswrapper[4740]: E0105 13:49:41.766905 4740 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 05 13:49:41 crc kubenswrapper[4740]: E0105 13:49:41.766885 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:49.766835238 +0000 UTC m=+39.073743847 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 05 13:49:41 crc kubenswrapper[4740]: E0105 13:49:41.766908 4740 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:41 crc kubenswrapper[4740]: E0105 13:49:41.766942 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:49.766926341 +0000 UTC m=+39.073834930 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 05 13:49:41 crc kubenswrapper[4740]: E0105 13:49:41.767040 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:49.767014383 +0000 UTC m=+39.073922992 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:41 crc kubenswrapper[4740]: E0105 13:49:41.767117 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:49.767103845 +0000 UTC m=+39.074012464 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:41 crc kubenswrapper[4740]: I0105 13:49:41.967190 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:41 crc kubenswrapper[4740]: I0105 13:49:41.967282 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:49:41 crc kubenswrapper[4740]: E0105 13:49:41.967337 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 05 13:49:41 crc kubenswrapper[4740]: E0105 13:49:41.967471 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 05 13:49:41 crc kubenswrapper[4740]: I0105 13:49:41.967541 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:49:41 crc kubenswrapper[4740]: E0105 13:49:41.967768 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 05 13:49:42 crc kubenswrapper[4740]: I0105 13:49:42.186818 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerStarted","Data":"b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261"} Jan 05 13:49:42 crc kubenswrapper[4740]: I0105 13:49:42.186878 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerStarted","Data":"512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25"} Jan 05 13:49:42 crc kubenswrapper[4740]: I0105 13:49:42.189306 4740 generic.go:334] "Generic (PLEG): container finished" podID="cbab02ea-f6b8-4598-9d8a-003ec5bed974" containerID="8f331000200a155502cf4102571662c8f658229fb0100a01d938cac54f58a80b" exitCode=0 Jan 05 13:49:42 crc kubenswrapper[4740]: I0105 13:49:42.189392 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x67zt" event={"ID":"cbab02ea-f6b8-4598-9d8a-003ec5bed974","Type":"ContainerDied","Data":"8f331000200a155502cf4102571662c8f658229fb0100a01d938cac54f58a80b"} Jan 05 13:49:42 crc kubenswrapper[4740]: I0105 13:49:42.967877 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:42 crc kubenswrapper[4740]: E0105 13:49:42.968065 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5qztt" podUID="b83a0bd7-6f44-4045-bb8c-e80f10959714" Jan 05 13:49:42 crc kubenswrapper[4740]: I0105 13:49:42.978052 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs\") pod \"network-metrics-daemon-5qztt\" (UID: \"b83a0bd7-6f44-4045-bb8c-e80f10959714\") " pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:42 crc kubenswrapper[4740]: E0105 13:49:42.978219 4740 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 05 13:49:42 crc kubenswrapper[4740]: E0105 13:49:42.978302 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs podName:b83a0bd7-6f44-4045-bb8c-e80f10959714 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:46.978281617 +0000 UTC m=+36.285190206 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs") pod "network-metrics-daemon-5qztt" (UID: "b83a0bd7-6f44-4045-bb8c-e80f10959714") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 05 13:49:43 crc kubenswrapper[4740]: I0105 13:49:43.195152 4740 generic.go:334] "Generic (PLEG): container finished" podID="cbab02ea-f6b8-4598-9d8a-003ec5bed974" containerID="b070ddd70d504890ae961cbd4c8dda1063a9abbd9f8436cf8408632532f12970" exitCode=0 Jan 05 13:49:43 crc kubenswrapper[4740]: I0105 13:49:43.195197 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x67zt" event={"ID":"cbab02ea-f6b8-4598-9d8a-003ec5bed974","Type":"ContainerDied","Data":"b070ddd70d504890ae961cbd4c8dda1063a9abbd9f8436cf8408632532f12970"} Jan 05 13:49:43 crc kubenswrapper[4740]: I0105 13:49:43.967902 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:49:43 crc kubenswrapper[4740]: I0105 13:49:43.967944 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:43 crc kubenswrapper[4740]: E0105 13:49:43.968439 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 05 13:49:43 crc kubenswrapper[4740]: I0105 13:49:43.968021 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:49:43 crc kubenswrapper[4740]: E0105 13:49:43.968521 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 05 13:49:43 crc kubenswrapper[4740]: E0105 13:49:43.968831 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 05 13:49:44 crc kubenswrapper[4740]: I0105 13:49:44.206865 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerStarted","Data":"43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da"} Jan 05 13:49:44 crc kubenswrapper[4740]: I0105 13:49:44.210630 4740 generic.go:334] "Generic (PLEG): container finished" podID="cbab02ea-f6b8-4598-9d8a-003ec5bed974" containerID="a9c468c6c0dd73cdf099a5e4cac5e66ea1d4574e26e8434dc012dcecb0beb4e7" exitCode=0 Jan 05 13:49:44 crc kubenswrapper[4740]: I0105 13:49:44.210659 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x67zt" event={"ID":"cbab02ea-f6b8-4598-9d8a-003ec5bed974","Type":"ContainerDied","Data":"a9c468c6c0dd73cdf099a5e4cac5e66ea1d4574e26e8434dc012dcecb0beb4e7"} Jan 05 13:49:44 crc kubenswrapper[4740]: I0105 13:49:44.967325 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:44 crc kubenswrapper[4740]: E0105 13:49:44.967501 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5qztt" podUID="b83a0bd7-6f44-4045-bb8c-e80f10959714" Jan 05 13:49:45 crc kubenswrapper[4740]: I0105 13:49:45.217865 4740 generic.go:334] "Generic (PLEG): container finished" podID="cbab02ea-f6b8-4598-9d8a-003ec5bed974" containerID="4e492e7983c3e4c7c32dccdc7a6e4fc6ff53498a35825c7d59114a6e46c7bb5c" exitCode=0 Jan 05 13:49:45 crc kubenswrapper[4740]: I0105 13:49:45.217959 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x67zt" event={"ID":"cbab02ea-f6b8-4598-9d8a-003ec5bed974","Type":"ContainerDied","Data":"4e492e7983c3e4c7c32dccdc7a6e4fc6ff53498a35825c7d59114a6e46c7bb5c"} Jan 05 13:49:45 crc kubenswrapper[4740]: I0105 13:49:45.967153 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:45 crc kubenswrapper[4740]: I0105 13:49:45.967159 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:49:45 crc kubenswrapper[4740]: I0105 13:49:45.967816 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:49:45 crc kubenswrapper[4740]: E0105 13:49:45.967969 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 05 13:49:45 crc kubenswrapper[4740]: E0105 13:49:45.968129 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 05 13:49:45 crc kubenswrapper[4740]: E0105 13:49:45.968674 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 05 13:49:46 crc kubenswrapper[4740]: I0105 13:49:46.227827 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x67zt" event={"ID":"cbab02ea-f6b8-4598-9d8a-003ec5bed974","Type":"ContainerStarted","Data":"c299098f6d08465bd82a95031b25809aa5daf45db7e99ab5e3b0cd0e5eaa7b12"} Jan 05 13:49:46 crc kubenswrapper[4740]: I0105 13:49:46.234853 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerStarted","Data":"8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302"} Jan 05 13:49:46 crc kubenswrapper[4740]: I0105 13:49:46.235255 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:46 crc kubenswrapper[4740]: I0105 13:49:46.235316 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:46 crc kubenswrapper[4740]: I0105 13:49:46.235338 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:46 crc kubenswrapper[4740]: I0105 13:49:46.256462 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-x67zt" podStartSLOduration=7.256438234 podStartE2EDuration="7.256438234s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:46.255974201 +0000 UTC m=+35.562882860" watchObservedRunningTime="2026-01-05 13:49:46.256438234 +0000 UTC m=+35.563346853" Jan 05 13:49:46 crc kubenswrapper[4740]: I0105 13:49:46.264923 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:46 crc kubenswrapper[4740]: I0105 13:49:46.270319 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:49:46 crc kubenswrapper[4740]: I0105 13:49:46.300244 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" podStartSLOduration=7.300222633 podStartE2EDuration="7.300222633s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:46.299742531 +0000 UTC m=+35.606651170" watchObservedRunningTime="2026-01-05 13:49:46.300222633 +0000 UTC m=+35.607131252" Jan 05 13:49:46 crc kubenswrapper[4740]: I0105 13:49:46.967383 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:46 crc kubenswrapper[4740]: E0105 13:49:46.967623 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5qztt" podUID="b83a0bd7-6f44-4045-bb8c-e80f10959714" Jan 05 13:49:47 crc kubenswrapper[4740]: I0105 13:49:47.016624 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs\") pod \"network-metrics-daemon-5qztt\" (UID: \"b83a0bd7-6f44-4045-bb8c-e80f10959714\") " pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:47 crc kubenswrapper[4740]: E0105 13:49:47.016971 4740 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 05 13:49:47 crc kubenswrapper[4740]: E0105 13:49:47.017182 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs podName:b83a0bd7-6f44-4045-bb8c-e80f10959714 nodeName:}" failed. No retries permitted until 2026-01-05 13:49:55.017146782 +0000 UTC m=+44.324055361 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs") pod "network-metrics-daemon-5qztt" (UID: "b83a0bd7-6f44-4045-bb8c-e80f10959714") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 05 13:49:47 crc kubenswrapper[4740]: I0105 13:49:47.967214 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:47 crc kubenswrapper[4740]: I0105 13:49:47.967232 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:49:47 crc kubenswrapper[4740]: I0105 13:49:47.967359 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:49:47 crc kubenswrapper[4740]: E0105 13:49:47.967459 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 05 13:49:47 crc kubenswrapper[4740]: E0105 13:49:47.967822 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 05 13:49:47 crc kubenswrapper[4740]: E0105 13:49:47.967884 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 05 13:49:48 crc kubenswrapper[4740]: I0105 13:49:48.001949 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5qztt"] Jan 05 13:49:48 crc kubenswrapper[4740]: I0105 13:49:48.002064 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:48 crc kubenswrapper[4740]: E0105 13:49:48.002181 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5qztt" podUID="b83a0bd7-6f44-4045-bb8c-e80f10959714" Jan 05 13:49:49 crc kubenswrapper[4740]: I0105 13:49:49.191785 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:49:49 crc kubenswrapper[4740]: I0105 13:49:49.741468 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:49 crc kubenswrapper[4740]: E0105 13:49:49.741738 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:50:05.741701152 +0000 UTC m=+55.048609771 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:49 crc kubenswrapper[4740]: I0105 13:49:49.842954 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:49 crc kubenswrapper[4740]: I0105 13:49:49.843057 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:49:49 crc kubenswrapper[4740]: I0105 13:49:49.843173 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:49 crc kubenswrapper[4740]: E0105 13:49:49.843201 4740 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 05 13:49:49 crc kubenswrapper[4740]: I0105 13:49:49.843257 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:49:49 crc kubenswrapper[4740]: E0105 13:49:49.843310 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:50:05.843276145 +0000 UTC m=+55.150184734 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 05 13:49:49 crc kubenswrapper[4740]: E0105 13:49:49.843525 4740 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 05 13:49:49 crc kubenswrapper[4740]: E0105 13:49:49.843573 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 05 13:49:49 crc kubenswrapper[4740]: E0105 13:49:49.843662 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 05 13:49:49 crc kubenswrapper[4740]: E0105 13:49:49.843667 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 05 13:49:49 crc kubenswrapper[4740]: E0105 13:49:49.843687 4740 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:49 crc kubenswrapper[4740]: E0105 13:49:49.843707 4740 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 05 13:49:49 crc kubenswrapper[4740]: E0105 13:49:49.843737 4740 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:49 crc kubenswrapper[4740]: E0105 13:49:49.843684 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:50:05.843651165 +0000 UTC m=+55.150559824 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 05 13:49:49 crc kubenswrapper[4740]: E0105 13:49:49.843856 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-05 13:50:05.843821559 +0000 UTC m=+55.150730378 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:49 crc kubenswrapper[4740]: E0105 13:49:49.843889 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-05 13:50:05.843874571 +0000 UTC m=+55.150783400 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 05 13:49:49 crc kubenswrapper[4740]: I0105 13:49:49.968296 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:49:49 crc kubenswrapper[4740]: I0105 13:49:49.968365 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:49 crc kubenswrapper[4740]: I0105 13:49:49.968377 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:49:49 crc kubenswrapper[4740]: I0105 13:49:49.968414 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:49 crc kubenswrapper[4740]: E0105 13:49:49.968509 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 05 13:49:49 crc kubenswrapper[4740]: E0105 13:49:49.968626 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 05 13:49:49 crc kubenswrapper[4740]: E0105 13:49:49.968679 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 05 13:49:49 crc kubenswrapper[4740]: E0105 13:49:49.968832 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5qztt" podUID="b83a0bd7-6f44-4045-bb8c-e80f10959714" Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.920104 4740 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.920336 4740 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.976864 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-qj9kj"] Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.977551 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.981309 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l"] Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.981808 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22"] Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.982390 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.982860 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l" Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.983048 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h2txk"] Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.983538 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h2txk" Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.987958 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.988423 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jvn7z"] Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.988655 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.988767 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.988688 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.989037 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.989233 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.989252 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.989285 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 05 13:49:50 crc kubenswrapper[4740]: I0105 13:49:50.994348 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.002305 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.002569 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7v62c"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.002711 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.002853 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.008674 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.009712 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.010258 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5msgl"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.010701 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.011192 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 05 13:49:51 crc 
kubenswrapper[4740]: I0105 13:49:51.008677 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.011256 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.011479 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.011683 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.011202 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.012110 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.012387 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.012549 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.013072 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.013186 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.013227 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.013238 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.013256 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.013372 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.013744 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.016610 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.017532 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lr6c7"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.018096 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.018226 4740 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lr6c7" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.018565 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.018658 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gwgzt"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.027682 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.030196 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z5cdg"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.030694 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mn8lv"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.031059 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-tjvv9"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.031402 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k4d99"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.031692 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-qj9kj"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.031711 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.031771 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.031840 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.031851 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.031842 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k4d99" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.031856 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-tjvv9" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.034524 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7v62c"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055540 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055586 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmfnb\" (UniqueName: \"kubernetes.io/projected/b8f9d516-9c35-42b4-a4d5-e6d189053b5e-kube-api-access-rmfnb\") pod \"console-operator-58897d9998-gwgzt\" (UID: \"b8f9d516-9c35-42b4-a4d5-e6d189053b5e\") " pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055610 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3f0bd5a-4767-4018-8a26-1c00a4f5bd58-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jd92l\" (UID: \"e3f0bd5a-4767-4018-8a26-1c00a4f5bd58\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055634 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055657 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-client-ca\") pod \"controller-manager-879f6c89f-z5cdg\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055677 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-service-ca\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055694 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b957b57-a5bc-43d9-acf0-88eb3b539af4-audit-dir\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055712 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxb7q\" 
(UniqueName: \"kubernetes.io/projected/8707fead-c468-4d30-8966-238dee410c47-kube-api-access-cxb7q\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055734 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73c1842-1fb1-46ca-98f0-da9491841fa4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h2txk\" (UID: \"c73c1842-1fb1-46ca-98f0-da9491841fa4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h2txk" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055750 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8707fead-c468-4d30-8966-238dee410c47-audit\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055770 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a03ac734-fdb2-4390-a5dc-1aed999390b4-audit-dir\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055792 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8707fead-c468-4d30-8966-238dee410c47-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055813 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8f9d516-9c35-42b4-a4d5-e6d189053b5e-serving-cert\") pod \"console-operator-58897d9998-gwgzt\" (UID: \"b8f9d516-9c35-42b4-a4d5-e6d189053b5e\") " pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055837 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb20baf5-f4d5-4234-9552-f4d73c447fcc-service-ca-bundle\") pod \"authentication-operator-69f744f599-mn8lv\" (UID: \"fb20baf5-f4d5-4234-9552-f4d73c447fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055863 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055937 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-config\") pod 
\"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055966 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.055998 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ftgp\" (UniqueName: \"kubernetes.io/projected/e432bbfa-f0e0-4e77-9e91-ad467795a8fe-kube-api-access-2ftgp\") pod \"openshift-controller-manager-operator-756b6f6bc6-k4d99\" (UID: \"e432bbfa-f0e0-4e77-9e91-ad467795a8fe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k4d99" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056026 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-oauth-serving-cert\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056055 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crmgz\" (UniqueName: \"kubernetes.io/projected/3087a0cb-777e-4c57-b376-aa006b7853c1-kube-api-access-crmgz\") pod \"openshift-config-operator-7777fb866f-jkzsn\" (UID: \"3087a0cb-777e-4c57-b376-aa006b7853c1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056099 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056152 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3f0bd5a-4767-4018-8a26-1c00a4f5bd58-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jd92l\" (UID: \"e3f0bd5a-4767-4018-8a26-1c00a4f5bd58\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056175 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3899b43b-6fe3-4a6d-9434-9a4754669370-serving-cert\") pod \"controller-manager-879f6c89f-z5cdg\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056195 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/97b4920c-fd39-4ef5-96b6-a044e3440f62-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7v62c\" (UID: \"97b4920c-fd39-4ef5-96b6-a044e3440f62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056217 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-oauth-config\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056234 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb20baf5-f4d5-4234-9552-f4d73c447fcc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mn8lv\" (UID: \"fb20baf5-f4d5-4234-9552-f4d73c447fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056252 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-config\") pod \"controller-manager-879f6c89f-z5cdg\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056315 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-762rz\" (UniqueName: \"kubernetes.io/projected/fb20baf5-f4d5-4234-9552-f4d73c447fcc-kube-api-access-762rz\") pod \"authentication-operator-69f744f599-mn8lv\" (UID: \"fb20baf5-f4d5-4234-9552-f4d73c447fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056339 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056357 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8707fead-c468-4d30-8966-238dee410c47-image-import-ca\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056378 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8707fead-c468-4d30-8966-238dee410c47-serving-cert\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056398 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9kcq\" (UniqueName: 
\"kubernetes.io/projected/577f43f3-8470-4de3-ab3b-9934f1deab62-kube-api-access-w9kcq\") pod \"route-controller-manager-6576b87f9c-bz475\" (UID: \"577f43f3-8470-4de3-ab3b-9934f1deab62\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056455 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfsl2\" (UniqueName: \"kubernetes.io/projected/039f49cf-6394-4a22-ba5b-e5b681a51ca6-kube-api-access-mfsl2\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056477 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3087a0cb-777e-4c57-b376-aa006b7853c1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jkzsn\" (UID: \"3087a0cb-777e-4c57-b376-aa006b7853c1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056525 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba8faaff-73f5-41b6-bef5-81982c44d15c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-lr6c7\" (UID: \"ba8faaff-73f5-41b6-bef5-81982c44d15c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lr6c7" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056548 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a03ac734-fdb2-4390-a5dc-1aed999390b4-audit-policies\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056567 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8707fead-c468-4d30-8966-238dee410c47-encryption-config\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056596 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt4fp\" (UniqueName: \"kubernetes.io/projected/1893ce72-b5f1-4630-8561-3637db61d45f-kube-api-access-lt4fp\") pod \"machine-approver-56656f9798-r7n7z\" (UID: \"1893ce72-b5f1-4630-8561-3637db61d45f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056626 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a03ac734-fdb2-4390-a5dc-1aed999390b4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056648 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056667 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z5cdg\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056698 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-trusted-ca-bundle\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056719 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056749 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/577f43f3-8470-4de3-ab3b-9934f1deab62-config\") pod \"route-controller-manager-6576b87f9c-bz475\" (UID: \"577f43f3-8470-4de3-ab3b-9934f1deab62\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056768 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8707fead-c468-4d30-8966-238dee410c47-config\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056786 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056803 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/577f43f3-8470-4de3-ab3b-9934f1deab62-serving-cert\") pod \"route-controller-manager-6576b87f9c-bz475\" (UID: \"577f43f3-8470-4de3-ab3b-9934f1deab62\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056841 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b8f9d516-9c35-42b4-a4d5-e6d189053b5e-config\") pod \"console-operator-58897d9998-gwgzt\" (UID: \"b8f9d516-9c35-42b4-a4d5-e6d189053b5e\") " pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056863 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a03ac734-fdb2-4390-a5dc-1aed999390b4-etcd-client\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056882 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a03ac734-fdb2-4390-a5dc-1aed999390b4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056948 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjtss\" (UniqueName: \"kubernetes.io/projected/e3f0bd5a-4767-4018-8a26-1c00a4f5bd58-kube-api-access-cjtss\") pod \"cluster-image-registry-operator-dc59b4c8b-jd92l\" (UID: \"e3f0bd5a-4767-4018-8a26-1c00a4f5bd58\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056970 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2djs\" (UniqueName: \"kubernetes.io/projected/c73c1842-1fb1-46ca-98f0-da9491841fa4-kube-api-access-g2djs\") pod \"openshift-apiserver-operator-796bbdcf4f-h2txk\" (UID: \"c73c1842-1fb1-46ca-98f0-da9491841fa4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h2txk" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.056988 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057013 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057032 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8707fead-c468-4d30-8966-238dee410c47-etcd-serving-ca\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057067 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fb20baf5-f4d5-4234-9552-f4d73c447fcc-config\") pod \"authentication-operator-69f744f599-mn8lv\" (UID: \"fb20baf5-f4d5-4234-9552-f4d73c447fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057105 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a03ac734-fdb2-4390-a5dc-1aed999390b4-serving-cert\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057147 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-audit-policies\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057167 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8707fead-c468-4d30-8966-238dee410c47-etcd-client\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057187 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e3f0bd5a-4767-4018-8a26-1c00a4f5bd58-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jd92l\" (UID: \"e3f0bd5a-4767-4018-8a26-1c00a4f5bd58\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057205 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-serving-cert\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057223 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbcjj\" (UniqueName: \"kubernetes.io/projected/ba8faaff-73f5-41b6-bef5-81982c44d15c-kube-api-access-hbcjj\") pod \"cluster-samples-operator-665b6dd947-lr6c7\" (UID: \"ba8faaff-73f5-41b6-bef5-81982c44d15c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lr6c7" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057272 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e432bbfa-f0e0-4e77-9e91-ad467795a8fe-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-k4d99\" (UID: \"e432bbfa-f0e0-4e77-9e91-ad467795a8fe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k4d99" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057290 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/97b4920c-fd39-4ef5-96b6-a044e3440f62-images\") pod \"machine-api-operator-5694c8668f-7v62c\" (UID: \"97b4920c-fd39-4ef5-96b6-a044e3440f62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057307 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9zd9\" (UniqueName: \"kubernetes.io/projected/a03ac734-fdb2-4390-a5dc-1aed999390b4-kube-api-access-l9zd9\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057327 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8707fead-c468-4d30-8966-238dee410c47-node-pullsecrets\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057343 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8707fead-c468-4d30-8966-238dee410c47-audit-dir\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057362 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bmd9\" (UniqueName: \"kubernetes.io/projected/3899b43b-6fe3-4a6d-9434-9a4754669370-kube-api-access-6bmd9\") pod \"controller-manager-879f6c89f-z5cdg\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057382 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkwkg\" (UniqueName: \"kubernetes.io/projected/97b4920c-fd39-4ef5-96b6-a044e3440f62-kube-api-access-mkwkg\") pod \"machine-api-operator-5694c8668f-7v62c\" (UID: \"97b4920c-fd39-4ef5-96b6-a044e3440f62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057399 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzj55\" (UniqueName: \"kubernetes.io/projected/f6a82703-5aea-4d1a-a8fa-3e4393a1176b-kube-api-access-qzj55\") pod \"downloads-7954f5f757-tjvv9\" (UID: \"f6a82703-5aea-4d1a-a8fa-3e4393a1176b\") " pod="openshift-console/downloads-7954f5f757-tjvv9" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057508 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb20baf5-f4d5-4234-9552-f4d73c447fcc-serving-cert\") pod \"authentication-operator-69f744f599-mn8lv\" (UID: \"fb20baf5-f4d5-4234-9552-f4d73c447fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057544 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e432bbfa-f0e0-4e77-9e91-ad467795a8fe-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-k4d99\" (UID: \"e432bbfa-f0e0-4e77-9e91-ad467795a8fe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k4d99" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057568 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3087a0cb-777e-4c57-b376-aa006b7853c1-serving-cert\") pod \"openshift-config-operator-7777fb866f-jkzsn\" (UID: \"3087a0cb-777e-4c57-b376-aa006b7853c1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057587 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c73c1842-1fb1-46ca-98f0-da9491841fa4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h2txk\" (UID: \"c73c1842-1fb1-46ca-98f0-da9491841fa4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h2txk" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057609 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8f9d516-9c35-42b4-a4d5-e6d189053b5e-trusted-ca\") pod \"console-operator-58897d9998-gwgzt\" (UID: \"b8f9d516-9c35-42b4-a4d5-e6d189053b5e\") " pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057647 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97b4920c-fd39-4ef5-96b6-a044e3440f62-config\") pod \"machine-api-operator-5694c8668f-7v62c\" (UID: \"97b4920c-fd39-4ef5-96b6-a044e3440f62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057669 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl88j\" (UniqueName: \"kubernetes.io/projected/8b957b57-a5bc-43d9-acf0-88eb3b539af4-kube-api-access-kl88j\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057697 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/577f43f3-8470-4de3-ab3b-9934f1deab62-client-ca\") pod \"route-controller-manager-6576b87f9c-bz475\" (UID: \"577f43f3-8470-4de3-ab3b-9934f1deab62\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057720 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1893ce72-b5f1-4630-8561-3637db61d45f-machine-approver-tls\") pod \"machine-approver-56656f9798-r7n7z\" (UID: \"1893ce72-b5f1-4630-8561-3637db61d45f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057737 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/1893ce72-b5f1-4630-8561-3637db61d45f-auth-proxy-config\") pod \"machine-approver-56656f9798-r7n7z\" (UID: \"1893ce72-b5f1-4630-8561-3637db61d45f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057758 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1893ce72-b5f1-4630-8561-3637db61d45f-config\") pod \"machine-approver-56656f9798-r7n7z\" (UID: \"1893ce72-b5f1-4630-8561-3637db61d45f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.057796 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a03ac734-fdb2-4390-a5dc-1aed999390b4-encryption-config\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.073259 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h2txk"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.074203 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.074520 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.074744 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.075651 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gwgzt"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.079638 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.079652 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.086356 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.086495 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.087159 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.088439 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.089438 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.090302 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.090403 4740 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.091121 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.091348 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.091668 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.090302 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.091815 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.092261 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.092769 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.092852 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.102164 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.102428 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.102547 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.102643 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.102991 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.103498 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.103568 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.103810 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.103926 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.104156 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 
13:49:51.104314 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.104462 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.104585 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.104703 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.104735 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.104913 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.105032 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.105316 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.105408 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.105459 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.104943 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.116329 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.116649 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.116691 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.116745 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.116816 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.117171 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.117265 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.116818 4740 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"config" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.118404 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.118671 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.118769 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jvn7z"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.118838 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.118998 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.119240 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.120166 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.121228 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.121350 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.121488 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.121615 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.121719 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.121873 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.121979 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.122189 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.122322 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.122488 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.122625 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.122691 4740 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.124430 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.125296 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.125779 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.125776 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.127782 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.129554 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.130314 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.131359 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k4d99"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.133811 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tjvv9"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.133971 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z5cdg"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.134401 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mn8lv"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.134095 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.135029 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.135831 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.151929 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.153545 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168093 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168216 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lr6c7"] Jan 05 
13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168434 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8707fead-c468-4d30-8966-238dee410c47-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168458 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8f9d516-9c35-42b4-a4d5-e6d189053b5e-serving-cert\") pod \"console-operator-58897d9998-gwgzt\" (UID: \"b8f9d516-9c35-42b4-a4d5-e6d189053b5e\") " pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168478 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb20baf5-f4d5-4234-9552-f4d73c447fcc-service-ca-bundle\") pod \"authentication-operator-69f744f599-mn8lv\" (UID: \"fb20baf5-f4d5-4234-9552-f4d73c447fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168496 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168512 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-config\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168528 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168547 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ftgp\" (UniqueName: \"kubernetes.io/projected/e432bbfa-f0e0-4e77-9e91-ad467795a8fe-kube-api-access-2ftgp\") pod \"openshift-controller-manager-operator-756b6f6bc6-k4d99\" (UID: \"e432bbfa-f0e0-4e77-9e91-ad467795a8fe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k4d99" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168564 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-oauth-serving-cert\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168582 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-crmgz\" (UniqueName: \"kubernetes.io/projected/3087a0cb-777e-4c57-b376-aa006b7853c1-kube-api-access-crmgz\") pod \"openshift-config-operator-7777fb866f-jkzsn\" (UID: \"3087a0cb-777e-4c57-b376-aa006b7853c1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168596 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168618 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3f0bd5a-4767-4018-8a26-1c00a4f5bd58-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jd92l\" (UID: \"e3f0bd5a-4767-4018-8a26-1c00a4f5bd58\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168633 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3899b43b-6fe3-4a6d-9434-9a4754669370-serving-cert\") pod \"controller-manager-879f6c89f-z5cdg\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168647 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/97b4920c-fd39-4ef5-96b6-a044e3440f62-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7v62c\" (UID: \"97b4920c-fd39-4ef5-96b6-a044e3440f62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168662 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-oauth-config\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168678 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb20baf5-f4d5-4234-9552-f4d73c447fcc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mn8lv\" (UID: \"fb20baf5-f4d5-4234-9552-f4d73c447fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168693 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-config\") pod \"controller-manager-879f6c89f-z5cdg\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168716 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-762rz\" (UniqueName: 
\"kubernetes.io/projected/fb20baf5-f4d5-4234-9552-f4d73c447fcc-kube-api-access-762rz\") pod \"authentication-operator-69f744f599-mn8lv\" (UID: \"fb20baf5-f4d5-4234-9552-f4d73c447fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168733 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168749 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8707fead-c468-4d30-8966-238dee410c47-image-import-ca\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168763 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8707fead-c468-4d30-8966-238dee410c47-serving-cert\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168780 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9kcq\" (UniqueName: \"kubernetes.io/projected/577f43f3-8470-4de3-ab3b-9934f1deab62-kube-api-access-w9kcq\") pod \"route-controller-manager-6576b87f9c-bz475\" (UID: \"577f43f3-8470-4de3-ab3b-9934f1deab62\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168795 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfsl2\" (UniqueName: \"kubernetes.io/projected/039f49cf-6394-4a22-ba5b-e5b681a51ca6-kube-api-access-mfsl2\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168811 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3087a0cb-777e-4c57-b376-aa006b7853c1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jkzsn\" (UID: \"3087a0cb-777e-4c57-b376-aa006b7853c1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168835 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba8faaff-73f5-41b6-bef5-81982c44d15c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-lr6c7\" (UID: \"ba8faaff-73f5-41b6-bef5-81982c44d15c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lr6c7" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168853 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a03ac734-fdb2-4390-a5dc-1aed999390b4-audit-policies\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: 
\"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168868 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8707fead-c468-4d30-8966-238dee410c47-encryption-config\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168883 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt4fp\" (UniqueName: \"kubernetes.io/projected/1893ce72-b5f1-4630-8561-3637db61d45f-kube-api-access-lt4fp\") pod \"machine-approver-56656f9798-r7n7z\" (UID: \"1893ce72-b5f1-4630-8561-3637db61d45f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168900 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a03ac734-fdb2-4390-a5dc-1aed999390b4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168917 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168932 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z5cdg\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168950 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-trusted-ca-bundle\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168998 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169017 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/577f43f3-8470-4de3-ab3b-9934f1deab62-config\") pod \"route-controller-manager-6576b87f9c-bz475\" (UID: \"577f43f3-8470-4de3-ab3b-9934f1deab62\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169034 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8707fead-c468-4d30-8966-238dee410c47-config\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169054 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169088 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/577f43f3-8470-4de3-ab3b-9934f1deab62-serving-cert\") pod \"route-controller-manager-6576b87f9c-bz475\" (UID: \"577f43f3-8470-4de3-ab3b-9934f1deab62\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169104 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8f9d516-9c35-42b4-a4d5-e6d189053b5e-config\") pod \"console-operator-58897d9998-gwgzt\" (UID: \"b8f9d516-9c35-42b4-a4d5-e6d189053b5e\") " pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169121 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a03ac734-fdb2-4390-a5dc-1aed999390b4-etcd-client\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169136 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a03ac734-fdb2-4390-a5dc-1aed999390b4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169161 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjtss\" (UniqueName: \"kubernetes.io/projected/e3f0bd5a-4767-4018-8a26-1c00a4f5bd58-kube-api-access-cjtss\") pod \"cluster-image-registry-operator-dc59b4c8b-jd92l\" (UID: \"e3f0bd5a-4767-4018-8a26-1c00a4f5bd58\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169178 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2djs\" (UniqueName: \"kubernetes.io/projected/c73c1842-1fb1-46ca-98f0-da9491841fa4-kube-api-access-g2djs\") pod \"openshift-apiserver-operator-796bbdcf4f-h2txk\" (UID: \"c73c1842-1fb1-46ca-98f0-da9491841fa4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h2txk" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169194 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169212 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169229 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8707fead-c468-4d30-8966-238dee410c47-etcd-serving-ca\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169248 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb20baf5-f4d5-4234-9552-f4d73c447fcc-config\") pod \"authentication-operator-69f744f599-mn8lv\" (UID: \"fb20baf5-f4d5-4234-9552-f4d73c447fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169262 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a03ac734-fdb2-4390-a5dc-1aed999390b4-serving-cert\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169277 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-audit-policies\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169292 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8707fead-c468-4d30-8966-238dee410c47-etcd-client\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169310 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e3f0bd5a-4767-4018-8a26-1c00a4f5bd58-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jd92l\" (UID: \"e3f0bd5a-4767-4018-8a26-1c00a4f5bd58\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169326 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-serving-cert\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 
13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169342 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbcjj\" (UniqueName: \"kubernetes.io/projected/ba8faaff-73f5-41b6-bef5-81982c44d15c-kube-api-access-hbcjj\") pod \"cluster-samples-operator-665b6dd947-lr6c7\" (UID: \"ba8faaff-73f5-41b6-bef5-81982c44d15c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lr6c7" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169358 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e432bbfa-f0e0-4e77-9e91-ad467795a8fe-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-k4d99\" (UID: \"e432bbfa-f0e0-4e77-9e91-ad467795a8fe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k4d99" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169375 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/97b4920c-fd39-4ef5-96b6-a044e3440f62-images\") pod \"machine-api-operator-5694c8668f-7v62c\" (UID: \"97b4920c-fd39-4ef5-96b6-a044e3440f62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169391 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9zd9\" (UniqueName: \"kubernetes.io/projected/a03ac734-fdb2-4390-a5dc-1aed999390b4-kube-api-access-l9zd9\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169405 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8707fead-c468-4d30-8966-238dee410c47-node-pullsecrets\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169420 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8707fead-c468-4d30-8966-238dee410c47-audit-dir\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169441 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bmd9\" (UniqueName: \"kubernetes.io/projected/3899b43b-6fe3-4a6d-9434-9a4754669370-kube-api-access-6bmd9\") pod \"controller-manager-879f6c89f-z5cdg\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169463 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkwkg\" (UniqueName: \"kubernetes.io/projected/97b4920c-fd39-4ef5-96b6-a044e3440f62-kube-api-access-mkwkg\") pod \"machine-api-operator-5694c8668f-7v62c\" (UID: \"97b4920c-fd39-4ef5-96b6-a044e3440f62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169484 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzj55\" 
(UniqueName: \"kubernetes.io/projected/f6a82703-5aea-4d1a-a8fa-3e4393a1176b-kube-api-access-qzj55\") pod \"downloads-7954f5f757-tjvv9\" (UID: \"f6a82703-5aea-4d1a-a8fa-3e4393a1176b\") " pod="openshift-console/downloads-7954f5f757-tjvv9" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169512 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb20baf5-f4d5-4234-9552-f4d73c447fcc-serving-cert\") pod \"authentication-operator-69f744f599-mn8lv\" (UID: \"fb20baf5-f4d5-4234-9552-f4d73c447fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169531 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e432bbfa-f0e0-4e77-9e91-ad467795a8fe-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-k4d99\" (UID: \"e432bbfa-f0e0-4e77-9e91-ad467795a8fe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k4d99" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169548 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3087a0cb-777e-4c57-b376-aa006b7853c1-serving-cert\") pod \"openshift-config-operator-7777fb866f-jkzsn\" (UID: \"3087a0cb-777e-4c57-b376-aa006b7853c1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169564 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c73c1842-1fb1-46ca-98f0-da9491841fa4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h2txk\" (UID: \"c73c1842-1fb1-46ca-98f0-da9491841fa4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h2txk" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169578 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8f9d516-9c35-42b4-a4d5-e6d189053b5e-trusted-ca\") pod \"console-operator-58897d9998-gwgzt\" (UID: \"b8f9d516-9c35-42b4-a4d5-e6d189053b5e\") " pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169596 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97b4920c-fd39-4ef5-96b6-a044e3440f62-config\") pod \"machine-api-operator-5694c8668f-7v62c\" (UID: \"97b4920c-fd39-4ef5-96b6-a044e3440f62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169611 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl88j\" (UniqueName: \"kubernetes.io/projected/8b957b57-a5bc-43d9-acf0-88eb3b539af4-kube-api-access-kl88j\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169627 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/577f43f3-8470-4de3-ab3b-9934f1deab62-client-ca\") pod \"route-controller-manager-6576b87f9c-bz475\" (UID: 
\"577f43f3-8470-4de3-ab3b-9934f1deab62\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169642 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1893ce72-b5f1-4630-8561-3637db61d45f-machine-approver-tls\") pod \"machine-approver-56656f9798-r7n7z\" (UID: \"1893ce72-b5f1-4630-8561-3637db61d45f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169657 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1893ce72-b5f1-4630-8561-3637db61d45f-auth-proxy-config\") pod \"machine-approver-56656f9798-r7n7z\" (UID: \"1893ce72-b5f1-4630-8561-3637db61d45f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169674 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1893ce72-b5f1-4630-8561-3637db61d45f-config\") pod \"machine-approver-56656f9798-r7n7z\" (UID: \"1893ce72-b5f1-4630-8561-3637db61d45f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169699 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a03ac734-fdb2-4390-a5dc-1aed999390b4-encryption-config\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169717 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169737 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmfnb\" (UniqueName: \"kubernetes.io/projected/b8f9d516-9c35-42b4-a4d5-e6d189053b5e-kube-api-access-rmfnb\") pod \"console-operator-58897d9998-gwgzt\" (UID: \"b8f9d516-9c35-42b4-a4d5-e6d189053b5e\") " pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169755 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3f0bd5a-4767-4018-8a26-1c00a4f5bd58-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jd92l\" (UID: \"e3f0bd5a-4767-4018-8a26-1c00a4f5bd58\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169772 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169787 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-client-ca\") pod \"controller-manager-879f6c89f-z5cdg\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169804 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-service-ca\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169818 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b957b57-a5bc-43d9-acf0-88eb3b539af4-audit-dir\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169834 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxb7q\" (UniqueName: \"kubernetes.io/projected/8707fead-c468-4d30-8966-238dee410c47-kube-api-access-cxb7q\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169850 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73c1842-1fb1-46ca-98f0-da9491841fa4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h2txk\" (UID: \"c73c1842-1fb1-46ca-98f0-da9491841fa4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h2txk" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169865 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8707fead-c468-4d30-8966-238dee410c47-audit\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169884 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a03ac734-fdb2-4390-a5dc-1aed999390b4-audit-dir\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.169957 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a03ac734-fdb2-4390-a5dc-1aed999390b4-audit-dir\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.171226 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8707fead-c468-4d30-8966-238dee410c47-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jvn7z\" (UID: 
\"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.173007 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.175229 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb20baf5-f4d5-4234-9552-f4d73c447fcc-service-ca-bundle\") pod \"authentication-operator-69f744f599-mn8lv\" (UID: \"fb20baf5-f4d5-4234-9552-f4d73c447fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.175916 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.176233 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8f9d516-9c35-42b4-a4d5-e6d189053b5e-serving-cert\") pod \"console-operator-58897d9998-gwgzt\" (UID: \"b8f9d516-9c35-42b4-a4d5-e6d189053b5e\") " pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.176709 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-config\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.176715 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3087a0cb-777e-4c57-b376-aa006b7853c1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jkzsn\" (UID: \"3087a0cb-777e-4c57-b376-aa006b7853c1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.177283 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.177962 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-oauth-serving-cert\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.179842 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8707fead-c468-4d30-8966-238dee410c47-config\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " 
pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.180111 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.180345 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-config\") pod \"controller-manager-879f6c89f-z5cdg\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.180812 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/577f43f3-8470-4de3-ab3b-9934f1deab62-config\") pod \"route-controller-manager-6576b87f9c-bz475\" (UID: \"577f43f3-8470-4de3-ab3b-9934f1deab62\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.180832 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.181300 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8707fead-c468-4d30-8966-238dee410c47-etcd-serving-ca\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.181867 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb20baf5-f4d5-4234-9552-f4d73c447fcc-config\") pod \"authentication-operator-69f744f599-mn8lv\" (UID: \"fb20baf5-f4d5-4234-9552-f4d73c447fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.182855 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8f9d516-9c35-42b4-a4d5-e6d189053b5e-trusted-ca\") pod \"console-operator-58897d9998-gwgzt\" (UID: \"b8f9d516-9c35-42b4-a4d5-e6d189053b5e\") " pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.182871 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba8faaff-73f5-41b6-bef5-81982c44d15c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-lr6c7\" (UID: \"ba8faaff-73f5-41b6-bef5-81982c44d15c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lr6c7" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.182983 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b8f9d516-9c35-42b4-a4d5-e6d189053b5e-config\") pod \"console-operator-58897d9998-gwgzt\" (UID: \"b8f9d516-9c35-42b4-a4d5-e6d189053b5e\") " pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.183276 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/577f43f3-8470-4de3-ab3b-9934f1deab62-serving-cert\") pod \"route-controller-manager-6576b87f9c-bz475\" (UID: \"577f43f3-8470-4de3-ab3b-9934f1deab62\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.183302 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a03ac734-fdb2-4390-a5dc-1aed999390b4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.183573 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a03ac734-fdb2-4390-a5dc-1aed999390b4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.183602 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.183682 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97b4920c-fd39-4ef5-96b6-a044e3440f62-config\") pod \"machine-api-operator-5694c8668f-7v62c\" (UID: \"97b4920c-fd39-4ef5-96b6-a044e3440f62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.183794 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/577f43f3-8470-4de3-ab3b-9934f1deab62-client-ca\") pod \"route-controller-manager-6576b87f9c-bz475\" (UID: \"577f43f3-8470-4de3-ab3b-9934f1deab62\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.183978 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1893ce72-b5f1-4630-8561-3637db61d45f-config\") pod \"machine-approver-56656f9798-r7n7z\" (UID: \"1893ce72-b5f1-4630-8561-3637db61d45f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.184176 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a03ac734-fdb2-4390-a5dc-1aed999390b4-audit-policies\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.184600 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z5cdg\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.184984 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3f0bd5a-4767-4018-8a26-1c00a4f5bd58-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jd92l\" (UID: \"e3f0bd5a-4767-4018-8a26-1c00a4f5bd58\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.185377 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3f0bd5a-4767-4018-8a26-1c00a4f5bd58-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jd92l\" (UID: \"e3f0bd5a-4767-4018-8a26-1c00a4f5bd58\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.185827 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a03ac734-fdb2-4390-a5dc-1aed999390b4-etcd-client\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.185987 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a03ac734-fdb2-4390-a5dc-1aed999390b4-encryption-config\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.186011 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c73c1842-1fb1-46ca-98f0-da9491841fa4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h2txk\" (UID: \"c73c1842-1fb1-46ca-98f0-da9491841fa4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h2txk" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.186088 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8707fead-c468-4d30-8966-238dee410c47-node-pullsecrets\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.186235 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.186417 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8707fead-c468-4d30-8966-238dee410c47-audit-dir\") pod \"apiserver-76f77b778f-jvn7z\" 
(UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.168708 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5msgl"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.187567 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8707fead-c468-4d30-8966-238dee410c47-image-import-ca\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.188668 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb20baf5-f4d5-4234-9552-f4d73c447fcc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mn8lv\" (UID: \"fb20baf5-f4d5-4234-9552-f4d73c447fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.188844 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/97b4920c-fd39-4ef5-96b6-a044e3440f62-images\") pod \"machine-api-operator-5694c8668f-7v62c\" (UID: \"97b4920c-fd39-4ef5-96b6-a044e3440f62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.188988 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1893ce72-b5f1-4630-8561-3637db61d45f-auth-proxy-config\") pod \"machine-approver-56656f9798-r7n7z\" (UID: \"1893ce72-b5f1-4630-8561-3637db61d45f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.189300 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b957b57-a5bc-43d9-acf0-88eb3b539af4-audit-dir\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.189416 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e432bbfa-f0e0-4e77-9e91-ad467795a8fe-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-k4d99\" (UID: \"e432bbfa-f0e0-4e77-9e91-ad467795a8fe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k4d99" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.189936 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-client-ca\") pod \"controller-manager-879f6c89f-z5cdg\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.190042 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-service-ca\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 
crc kubenswrapper[4740]: I0105 13:49:51.190195 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8707fead-c468-4d30-8966-238dee410c47-serving-cert\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.190469 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c73c1842-1fb1-46ca-98f0-da9491841fa4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h2txk\" (UID: \"c73c1842-1fb1-46ca-98f0-da9491841fa4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h2txk" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.190712 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e432bbfa-f0e0-4e77-9e91-ad467795a8fe-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-k4d99\" (UID: \"e432bbfa-f0e0-4e77-9e91-ad467795a8fe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k4d99" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.190754 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8707fead-c468-4d30-8966-238dee410c47-audit\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.190799 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-audit-policies\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.190855 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb20baf5-f4d5-4234-9552-f4d73c447fcc-serving-cert\") pod \"authentication-operator-69f744f599-mn8lv\" (UID: \"fb20baf5-f4d5-4234-9552-f4d73c447fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.191240 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-trusted-ca-bundle\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.191304 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/97b4920c-fd39-4ef5-96b6-a044e3440f62-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7v62c\" (UID: \"97b4920c-fd39-4ef5-96b6-a044e3440f62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.191678 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.191843 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3899b43b-6fe3-4a6d-9434-9a4754669370-serving-cert\") pod \"controller-manager-879f6c89f-z5cdg\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.192867 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8707fead-c468-4d30-8966-238dee410c47-encryption-config\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.192974 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3087a0cb-777e-4c57-b376-aa006b7853c1-serving-cert\") pod \"openshift-config-operator-7777fb866f-jkzsn\" (UID: \"3087a0cb-777e-4c57-b376-aa006b7853c1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.193008 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8707fead-c468-4d30-8966-238dee410c47-etcd-client\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.193368 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-serving-cert\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.195117 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1893ce72-b5f1-4630-8561-3637db61d45f-machine-approver-tls\") pod \"machine-approver-56656f9798-r7n7z\" (UID: \"1893ce72-b5f1-4630-8561-3637db61d45f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.195216 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2djs\" (UniqueName: \"kubernetes.io/projected/c73c1842-1fb1-46ca-98f0-da9491841fa4-kube-api-access-g2djs\") pod \"openshift-apiserver-operator-796bbdcf4f-h2txk\" (UID: \"c73c1842-1fb1-46ca-98f0-da9491841fa4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h2txk" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.196277 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-oauth-config\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.196365 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a03ac734-fdb2-4390-a5dc-1aed999390b4-serving-cert\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.197258 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.197674 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.206154 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.207036 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.213587 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9kcq\" (UniqueName: \"kubernetes.io/projected/577f43f3-8470-4de3-ab3b-9934f1deab62-kube-api-access-w9kcq\") pod \"route-controller-manager-6576b87f9c-bz475\" (UID: \"577f43f3-8470-4de3-ab3b-9934f1deab62\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.233219 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfsl2\" (UniqueName: \"kubernetes.io/projected/039f49cf-6394-4a22-ba5b-e5b681a51ca6-kube-api-access-mfsl2\") pod \"console-f9d7485db-qj9kj\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.247423 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dqws6"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.247879 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.248369 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.249092 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.252842 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ppzj9"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.253424 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc6s7"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.254509 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ppzj9" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.255525 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qvxnd"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.256052 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-f8vfq"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.256444 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.257855 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc6s7" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.258830 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-59r7l"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.262704 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qvxnd" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.263347 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tbzlt"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.263640 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.263866 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-59r7l" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.264125 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.264650 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.265884 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ngp5g"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.266280 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ngp5g" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.282027 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zhjv"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.284450 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zhjv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.286518 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crmgz\" (UniqueName: \"kubernetes.io/projected/3087a0cb-777e-4c57-b376-aa006b7853c1-kube-api-access-crmgz\") pod \"openshift-config-operator-7777fb866f-jkzsn\" (UID: \"3087a0cb-777e-4c57-b376-aa006b7853c1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.290831 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.292086 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ftgp\" (UniqueName: \"kubernetes.io/projected/e432bbfa-f0e0-4e77-9e91-ad467795a8fe-kube-api-access-2ftgp\") pod \"openshift-controller-manager-operator-756b6f6bc6-k4d99\" (UID: \"e432bbfa-f0e0-4e77-9e91-ad467795a8fe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k4d99" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.292568 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.301162 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k4d99" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.306386 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmk6"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.327016 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkdl5"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.337556 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.337929 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.337992 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmfnb\" (UniqueName: \"kubernetes.io/projected/b8f9d516-9c35-42b4-a4d5-e6d189053b5e-kube-api-access-rmfnb\") pod \"console-operator-58897d9998-gwgzt\" (UID: \"b8f9d516-9c35-42b4-a4d5-e6d189053b5e\") " pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.338152 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkdl5" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.338153 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmk6" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.339034 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjtss\" (UniqueName: \"kubernetes.io/projected/e3f0bd5a-4767-4018-8a26-1c00a4f5bd58-kube-api-access-cjtss\") pod \"cluster-image-registry-operator-dc59b4c8b-jd92l\" (UID: \"e3f0bd5a-4767-4018-8a26-1c00a4f5bd58\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.339161 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.339124 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc6s7"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.340230 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ppzj9"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.340251 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dqws6"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.340260 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qvxnd"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.340270 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-67brg"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.341285 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.341797 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.341895 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-67brg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.342241 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.343008 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dztht"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.343708 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dztht" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.343901 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl88j\" (UniqueName: \"kubernetes.io/projected/8b957b57-a5bc-43d9-acf0-88eb3b539af4-kube-api-access-kl88j\") pod \"oauth-openshift-558db77b4-5msgl\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.344615 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.345469 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.345759 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45gg9"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.346540 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.346838 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pcbgg"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.348625 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.349044 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5676h"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.349091 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.349389 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.349584 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.349736 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5676h" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.350136 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.351319 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lg5bc"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.352040 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zhjv"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.352122 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.352539 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.353581 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tbzlt"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.354613 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.356145 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-59r7l"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.356257 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmk6"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.356843 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9zd9\" (UniqueName: \"kubernetes.io/projected/a03ac734-fdb2-4390-a5dc-1aed999390b4-kube-api-access-l9zd9\") pod \"apiserver-7bbb656c7d-lkk22\" (UID: \"a03ac734-fdb2-4390-a5dc-1aed999390b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.357308 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qxndq"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.357874 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qxndq" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.358272 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-g56kn"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.359373 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-g56kn" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.359517 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkdl5"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.360629 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-67brg"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.361946 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ngp5g"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.363391 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.365188 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.365730 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45gg9"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.366681 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.367689 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.367856 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.369211 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pcbgg"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.370214 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h2txk" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.370850 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.371915 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qxndq"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.372979 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5676h"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.373299 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt4fp\" (UniqueName: \"kubernetes.io/projected/1893ce72-b5f1-4630-8561-3637db61d45f-kube-api-access-lt4fp\") pod \"machine-approver-56656f9798-r7n7z\" (UID: \"1893ce72-b5f1-4630-8561-3637db61d45f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.373974 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dztht"] Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.393736 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bmd9\" (UniqueName: \"kubernetes.io/projected/3899b43b-6fe3-4a6d-9434-9a4754669370-kube-api-access-6bmd9\") pod \"controller-manager-879f6c89f-z5cdg\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.418681 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.424698 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkwkg\" (UniqueName: \"kubernetes.io/projected/97b4920c-fd39-4ef5-96b6-a044e3440f62-kube-api-access-mkwkg\") pod \"machine-api-operator-5694c8668f-7v62c\" (UID: \"97b4920c-fd39-4ef5-96b6-a044e3440f62\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.431982 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:49:51 crc kubenswrapper[4740]: W0105 13:49:51.433156 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1893ce72_b5f1_4630_8561_3637db61d45f.slice/crio-9e189f76afc1fb09697d117deaf8dfb9af862c25d56064eadacb51006ac6d79a WatchSource:0}: Error finding container 9e189f76afc1fb09697d117deaf8dfb9af862c25d56064eadacb51006ac6d79a: Status 404 returned error can't find the container with id 9e189f76afc1fb09697d117deaf8dfb9af862c25d56064eadacb51006ac6d79a Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.433230 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-762rz\" (UniqueName: \"kubernetes.io/projected/fb20baf5-f4d5-4234-9552-f4d73c447fcc-kube-api-access-762rz\") pod \"authentication-operator-69f744f599-mn8lv\" (UID: \"fb20baf5-f4d5-4234-9552-f4d73c447fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.458420 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzj55\" (UniqueName: \"kubernetes.io/projected/f6a82703-5aea-4d1a-a8fa-3e4393a1176b-kube-api-access-qzj55\") pod \"downloads-7954f5f757-tjvv9\" (UID: \"f6a82703-5aea-4d1a-a8fa-3e4393a1176b\") " pod="openshift-console/downloads-7954f5f757-tjvv9" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.475098 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbcjj\" (UniqueName: \"kubernetes.io/projected/ba8faaff-73f5-41b6-bef5-81982c44d15c-kube-api-access-hbcjj\") pod \"cluster-samples-operator-665b6dd947-lr6c7\" (UID: \"ba8faaff-73f5-41b6-bef5-81982c44d15c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lr6c7" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.482507 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.513366 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e3f0bd5a-4767-4018-8a26-1c00a4f5bd58-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jd92l\" (UID: \"e3f0bd5a-4767-4018-8a26-1c00a4f5bd58\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.533643 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxb7q\" (UniqueName: \"kubernetes.io/projected/8707fead-c468-4d30-8966-238dee410c47-kube-api-access-cxb7q\") pod \"apiserver-76f77b778f-jvn7z\" (UID: \"8707fead-c468-4d30-8966-238dee410c47\") " pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.539716 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.558159 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.559816 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.580691 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.590412 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.607903 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.620900 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.629189 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.641760 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.642931 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.650182 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-tjvv9" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.651284 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.661391 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.680695 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.681276 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.701384 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.708021 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.721032 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.740324 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.775004 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lr6c7" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.780021 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.800917 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.821748 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.840673 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.861306 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.883821 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.900165 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.920426 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.941756 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.960385 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.967805 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.968472 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.968549 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.968684 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:49:51 crc kubenswrapper[4740]: I0105 13:49:51.980188 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.000784 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.019585 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.037914 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-qj9kj"] Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.039765 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 05 13:49:52 crc kubenswrapper[4740]: W0105 13:49:52.046524 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod039f49cf_6394_4a22_ba5b_e5b681a51ca6.slice/crio-5f86689a6ce4cab1d74ce74a7ca2c537828587ffc1ef96a477fb74ee01f31d28 WatchSource:0}: Error finding container 5f86689a6ce4cab1d74ce74a7ca2c537828587ffc1ef96a477fb74ee01f31d28: Status 404 returned error can't find the container with id 5f86689a6ce4cab1d74ce74a7ca2c537828587ffc1ef96a477fb74ee01f31d28 Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.053669 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z5cdg"] Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.059763 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.079960 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.099955 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.121524 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.141248 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.160898 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.180377 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.201289 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.222906 4740 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.240796 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.240902 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gwgzt"] Jan 05 13:49:52 crc kubenswrapper[4740]: W0105 13:49:52.248822 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8f9d516_9c35_42b4_a4d5_e6d189053b5e.slice/crio-23f7ae24b9d844dccc95db4958bfcd0467d5767f3db518ff33c17055e2c83212 WatchSource:0}: Error finding container 23f7ae24b9d844dccc95db4958bfcd0467d5767f3db518ff33c17055e2c83212: Status 404 returned error can't find the container with id 23f7ae24b9d844dccc95db4958bfcd0467d5767f3db518ff33c17055e2c83212 Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.253485 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7v62c"] Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.259947 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" event={"ID":"1893ce72-b5f1-4630-8561-3637db61d45f","Type":"ContainerStarted","Data":"561be679df2696606d8d05360c708182460c2f938c445522ed605204bcfc5d39"} Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.260005 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" event={"ID":"1893ce72-b5f1-4630-8561-3637db61d45f","Type":"ContainerStarted","Data":"9e189f76afc1fb09697d117deaf8dfb9af862c25d56064eadacb51006ac6d79a"} Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.260132 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.277494 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" event={"ID":"3899b43b-6fe3-4a6d-9434-9a4754669370","Type":"ContainerStarted","Data":"38e5a5cd0c475a41f25bb67ef0fc813fb8b47f539c52cef9b63c998902ccb58a"} Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.278732 4740 request.go:700] Waited for 1.014215075s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/configmaps?fieldSelector=metadata.name%3Detcd-ca-bundle&limit=500&resourceVersion=0 Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.279596 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qj9kj" event={"ID":"039f49cf-6394-4a22-ba5b-e5b681a51ca6","Type":"ContainerStarted","Data":"5f86689a6ce4cab1d74ce74a7ca2c537828587ffc1ef96a477fb74ee01f31d28"} Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.281130 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.284369 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" event={"ID":"b8f9d516-9c35-42b4-a4d5-e6d189053b5e","Type":"ContainerStarted","Data":"23f7ae24b9d844dccc95db4958bfcd0467d5767f3db518ff33c17055e2c83212"} Jan 05 13:49:52 crc 
kubenswrapper[4740]: I0105 13:49:52.294425 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn"] Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.300115 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.320469 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.340645 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.360614 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.381040 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.396899 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475"] Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.409966 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.412657 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l"] Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.414370 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k4d99"] Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.415253 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5msgl"] Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.416455 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tjvv9"] Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.419963 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h2txk"] Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.422305 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.424675 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jvn7z"] Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.426325 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22"] Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.427043 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lr6c7"] Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.432862 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mn8lv"] Jan 05 13:49:52 crc kubenswrapper[4740]: W0105 13:49:52.438026 4740 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode432bbfa_f0e0_4e77_9e91_ad467795a8fe.slice/crio-a248c5a09d4bffee36fba121d72c1cfae96773694971715b506df6e4f81a3c8a WatchSource:0}: Error finding container a248c5a09d4bffee36fba121d72c1cfae96773694971715b506df6e4f81a3c8a: Status 404 returned error can't find the container with id a248c5a09d4bffee36fba121d72c1cfae96773694971715b506df6e4f81a3c8a Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.440783 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.459956 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 05 13:49:52 crc kubenswrapper[4740]: W0105 13:49:52.462001 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3f0bd5a_4767_4018_8a26_1c00a4f5bd58.slice/crio-cc987458239d06375874c231d4cdfb03a8b0104f7fd5363a26c1df083954a01b WatchSource:0}: Error finding container cc987458239d06375874c231d4cdfb03a8b0104f7fd5363a26c1df083954a01b: Status 404 returned error can't find the container with id cc987458239d06375874c231d4cdfb03a8b0104f7fd5363a26c1df083954a01b Jan 05 13:49:52 crc kubenswrapper[4740]: W0105 13:49:52.462771 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6a82703_5aea_4d1a_a8fa_3e4393a1176b.slice/crio-c874e0815568e5593c197bc07d722b6d5ef34f2fc9677e2c683b77fad579052c WatchSource:0}: Error finding container c874e0815568e5593c197bc07d722b6d5ef34f2fc9677e2c683b77fad579052c: Status 404 returned error can't find the container with id c874e0815568e5593c197bc07d722b6d5ef34f2fc9677e2c683b77fad579052c Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.481131 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.499532 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.519779 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.540118 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.562500 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.580742 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.600237 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.621315 4740 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 05 13:49:52 crc kubenswrapper[4740]: W0105 13:49:52.627203 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8707fead_c468_4d30_8966_238dee410c47.slice/crio-40f8388eb4e1b9000e93e9def1b26f31d56c0a7b34e7a982554d194f21567dac WatchSource:0}: Error finding container 40f8388eb4e1b9000e93e9def1b26f31d56c0a7b34e7a982554d194f21567dac: Status 404 returned error can't find the container with id 40f8388eb4e1b9000e93e9def1b26f31d56c0a7b34e7a982554d194f21567dac Jan 05 13:49:52 crc kubenswrapper[4740]: W0105 13:49:52.627593 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b957b57_a5bc_43d9_acf0_88eb3b539af4.slice/crio-89283df55b398c863ed0c898cba5ab290a3f05873ee0faae6152dd0528590d8f WatchSource:0}: Error finding container 89283df55b398c863ed0c898cba5ab290a3f05873ee0faae6152dd0528590d8f: Status 404 returned error can't find the container with id 89283df55b398c863ed0c898cba5ab290a3f05873ee0faae6152dd0528590d8f Jan 05 13:49:52 crc kubenswrapper[4740]: W0105 13:49:52.628328 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda03ac734_fdb2_4390_a5dc_1aed999390b4.slice/crio-4ef0f6a99500cda7e92e32baf32fae3db57662cea9db770f2bec56cbbaa6f1ae WatchSource:0}: Error finding container 4ef0f6a99500cda7e92e32baf32fae3db57662cea9db770f2bec56cbbaa6f1ae: Status 404 returned error can't find the container with id 4ef0f6a99500cda7e92e32baf32fae3db57662cea9db770f2bec56cbbaa6f1ae Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.640739 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.659822 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.681930 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.700548 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.720439 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.741302 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.760257 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.781810 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.803489 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.820382 4740 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.840874 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.865403 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.880620 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.900546 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.919998 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.940345 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.960470 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 05 13:49:52 crc kubenswrapper[4740]: I0105 13:49:52.981838 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.000509 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.020140 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.039821 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.069542 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.082592 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.100093 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.120442 4740 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.140253 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.159697 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.180455 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 05 13:49:53 crc 
kubenswrapper[4740]: I0105 13:49:53.199626 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.221503 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.241209 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.260566 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.280180 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.288659 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" event={"ID":"3899b43b-6fe3-4a6d-9434-9a4754669370","Type":"ContainerStarted","Data":"a7e9eb90bacd6d3a4f33afac26d9f1927ef9103dffee4c458fcd269d9d9ce11c"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.288847 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.291289 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lr6c7" event={"ID":"ba8faaff-73f5-41b6-bef5-81982c44d15c","Type":"ContainerStarted","Data":"d72ad6bbe1f273c62788ceb08a58089892739041e17c0bd53570f72e1f202df3"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.291327 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lr6c7" event={"ID":"ba8faaff-73f5-41b6-bef5-81982c44d15c","Type":"ContainerStarted","Data":"27cab93e8cd72d5cd4bbed1ad51ac26fe29e57f33022ba26e92a24f130b28200"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.291339 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lr6c7" event={"ID":"ba8faaff-73f5-41b6-bef5-81982c44d15c","Type":"ContainerStarted","Data":"92b909da3938951a801e474e9f28acc0470b2b4e91931a520e14fccee64ed98a"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.293231 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h2txk" event={"ID":"c73c1842-1fb1-46ca-98f0-da9491841fa4","Type":"ContainerStarted","Data":"f3ab465a3781489686634d9fd557268600030a62ef62cec37f4e298a64f1780a"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.293254 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h2txk" event={"ID":"c73c1842-1fb1-46ca-98f0-da9491841fa4","Type":"ContainerStarted","Data":"093dba3ce1c18f76b1e000a4eac5345c245725b42aa062dd241bfe58ba6f48ba"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.294706 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k4d99" 
event={"ID":"e432bbfa-f0e0-4e77-9e91-ad467795a8fe","Type":"ContainerStarted","Data":"83a191579118fcad3ff088b76b2cef3f97c2d50d47a188cf84a64fdb39449f84"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.294726 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k4d99" event={"ID":"e432bbfa-f0e0-4e77-9e91-ad467795a8fe","Type":"ContainerStarted","Data":"a248c5a09d4bffee36fba121d72c1cfae96773694971715b506df6e4f81a3c8a"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.296084 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" event={"ID":"97b4920c-fd39-4ef5-96b6-a044e3440f62","Type":"ContainerStarted","Data":"3e48a810a4d79094782b0df858fec6a0afa213d95129d04255f00b1f4bfe8f98"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.296107 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" event={"ID":"97b4920c-fd39-4ef5-96b6-a044e3440f62","Type":"ContainerStarted","Data":"4b9504dbdd4c05edd3166fa7c132e57f41db36a685c4a672a24be7de5f5d84c2"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.296117 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" event={"ID":"97b4920c-fd39-4ef5-96b6-a044e3440f62","Type":"ContainerStarted","Data":"161c48973bfbd1a8dbf9b2f96db1152565c103092cf087d967c4d4174507f4a5"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.296592 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.298235 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l" event={"ID":"e3f0bd5a-4767-4018-8a26-1c00a4f5bd58","Type":"ContainerStarted","Data":"e5b1612275ce8571971f64ebabae673da8e950ec50d3dcf1ad8b0097b3586961"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.298258 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l" event={"ID":"e3f0bd5a-4767-4018-8a26-1c00a4f5bd58","Type":"ContainerStarted","Data":"cc987458239d06375874c231d4cdfb03a8b0104f7fd5363a26c1df083954a01b"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.298333 4740 request.go:700] Waited for 1.946068588s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/configmaps?fieldSelector=metadata.name%3Dcni-sysctl-allowlist&limit=500&resourceVersion=0 Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.299506 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.299715 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qj9kj" event={"ID":"039f49cf-6394-4a22-ba5b-e5b681a51ca6","Type":"ContainerStarted","Data":"308db88f61949f73e11ad38650eae3962ede34343072bf590a3396e71131bd89"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.301887 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tjvv9" 
event={"ID":"f6a82703-5aea-4d1a-a8fa-3e4393a1176b","Type":"ContainerStarted","Data":"362f58e64ce41e5781f3ba969f62e4427436d4534306e8cf23ced44611577ab5"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.301909 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tjvv9" event={"ID":"f6a82703-5aea-4d1a-a8fa-3e4393a1176b","Type":"ContainerStarted","Data":"c874e0815568e5593c197bc07d722b6d5ef34f2fc9677e2c683b77fad579052c"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.302194 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-tjvv9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.303374 4740 generic.go:334] "Generic (PLEG): container finished" podID="a03ac734-fdb2-4390-a5dc-1aed999390b4" containerID="c08ac9959880c32d61e1a512707759f141178de00a3778eed23a31715a3352a6" exitCode=0 Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.303426 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" event={"ID":"a03ac734-fdb2-4390-a5dc-1aed999390b4","Type":"ContainerDied","Data":"c08ac9959880c32d61e1a512707759f141178de00a3778eed23a31715a3352a6"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.303587 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" event={"ID":"a03ac734-fdb2-4390-a5dc-1aed999390b4","Type":"ContainerStarted","Data":"4ef0f6a99500cda7e92e32baf32fae3db57662cea9db770f2bec56cbbaa6f1ae"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.303818 4740 patch_prober.go:28] interesting pod/downloads-7954f5f757-tjvv9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.303849 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tjvv9" podUID="f6a82703-5aea-4d1a-a8fa-3e4393a1176b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.306026 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" event={"ID":"8b957b57-a5bc-43d9-acf0-88eb3b539af4","Type":"ContainerStarted","Data":"bff4644999fd8d3ffb1be1a6d2f75e99febdd785f719001fd9d2e144b80553d4"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.306067 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" event={"ID":"8b957b57-a5bc-43d9-acf0-88eb3b539af4","Type":"ContainerStarted","Data":"89283df55b398c863ed0c898cba5ab290a3f05873ee0faae6152dd0528590d8f"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.306604 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.314181 4740 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-5msgl container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.19:6443/healthz\": dial tcp 10.217.0.19:6443: connect: connection refused" start-of-body= Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.314227 4740 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" podUID="8b957b57-a5bc-43d9-acf0-88eb3b539af4" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.19:6443/healthz\": dial tcp 10.217.0.19:6443: connect: connection refused" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.315713 4740 generic.go:334] "Generic (PLEG): container finished" podID="3087a0cb-777e-4c57-b376-aa006b7853c1" containerID="a5387c564d3b64f8bbaf1533d92223c98239d7c35856f5ab6a5c2d26d97e7d02" exitCode=0 Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.315773 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn" event={"ID":"3087a0cb-777e-4c57-b376-aa006b7853c1","Type":"ContainerDied","Data":"a5387c564d3b64f8bbaf1533d92223c98239d7c35856f5ab6a5c2d26d97e7d02"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.315800 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn" event={"ID":"3087a0cb-777e-4c57-b376-aa006b7853c1","Type":"ContainerStarted","Data":"b965256608cb7172605296eafb8becdb73419c7c6cb67c6c5624b25c87c25f47"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.322008 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" event={"ID":"fb20baf5-f4d5-4234-9552-f4d73c447fcc","Type":"ContainerStarted","Data":"a53d6d483e52d8db7e3b594ef4b0aeebecc626bd228672ccefde05069df831ca"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.322044 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" event={"ID":"fb20baf5-f4d5-4234-9552-f4d73c447fcc","Type":"ContainerStarted","Data":"116b7f7ab685344a6db0bb38d879c98acf09887af1afd75d83c21589331ba238"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.324483 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.327611 4740 generic.go:334] "Generic (PLEG): container finished" podID="8707fead-c468-4d30-8966-238dee410c47" containerID="7183e4fc87fe1bbcf9000351cc7bb37e75809f7ac89566c9aa783f7ea53b18e6" exitCode=0 Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.328189 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" event={"ID":"8707fead-c468-4d30-8966-238dee410c47","Type":"ContainerDied","Data":"7183e4fc87fe1bbcf9000351cc7bb37e75809f7ac89566c9aa783f7ea53b18e6"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.328220 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" event={"ID":"8707fead-c468-4d30-8966-238dee410c47","Type":"ContainerStarted","Data":"40f8388eb4e1b9000e93e9def1b26f31d56c0a7b34e7a982554d194f21567dac"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.343661 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.345823 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" 
event={"ID":"1893ce72-b5f1-4630-8561-3637db61d45f","Type":"ContainerStarted","Data":"99753b2f51142d96b0b92744f59bcf6fc7db2660f4b8cccedc55cd4c16ea33d3"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.349713 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" event={"ID":"b8f9d516-9c35-42b4-a4d5-e6d189053b5e","Type":"ContainerStarted","Data":"1cb8ce148fa643ebf57fc324bdbfab16b67bc36f03b57a86e7eb031a2cfc9617"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.350379 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.351616 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" event={"ID":"577f43f3-8470-4de3-ab3b-9934f1deab62","Type":"ContainerStarted","Data":"20179e62d0bc0fe542f80a38a88a74790d63cf1372f5ac028d2d5381f64d3d85"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.351670 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" event={"ID":"577f43f3-8470-4de3-ab3b-9934f1deab62","Type":"ContainerStarted","Data":"569e7b64f94c4a85040b23ac071175a02175e65ef0170c9ec39d1264d520cc6f"} Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.351689 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.361198 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.383627 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.402929 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.425011 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.445388 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.500670 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.520012 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.532611 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f488db4f-55ab-4654-b225-38742d36877c-trusted-ca\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.532665 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrhsk\" 
(UniqueName: \"kubernetes.io/projected/f27a472e-ea53-41f1-9360-84b2f9f0fd36-kube-api-access-xrhsk\") pod \"ingress-operator-5b745b69d9-75w2r\" (UID: \"f27a472e-ea53-41f1-9360-84b2f9f0fd36\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.532796 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f27a472e-ea53-41f1-9360-84b2f9f0fd36-trusted-ca\") pod \"ingress-operator-5b745b69d9-75w2r\" (UID: \"f27a472e-ea53-41f1-9360-84b2f9f0fd36\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.532906 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.532935 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt2vz\" (UniqueName: \"kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-kube-api-access-xt2vz\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.532963 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f488db4f-55ab-4654-b225-38742d36877c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.533021 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9654c8c-9b4e-4f40-8226-ada3fccc7b28-config-volume\") pod \"dns-default-ppzj9\" (UID: \"c9654c8c-9b4e-4f40-8226-ada3fccc7b28\") " pod="openshift-dns/dns-default-ppzj9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.533101 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnn4j\" (UniqueName: \"kubernetes.io/projected/c9654c8c-9b4e-4f40-8226-ada3fccc7b28-kube-api-access-lnn4j\") pod \"dns-default-ppzj9\" (UID: \"c9654c8c-9b4e-4f40-8226-ada3fccc7b28\") " pod="openshift-dns/dns-default-ppzj9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.533265 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f488db4f-55ab-4654-b225-38742d36877c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.533287 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-bound-sa-token\") pod 
\"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.533302 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f27a472e-ea53-41f1-9360-84b2f9f0fd36-bound-sa-token\") pod \"ingress-operator-5b745b69d9-75w2r\" (UID: \"f27a472e-ea53-41f1-9360-84b2f9f0fd36\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.533328 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-registry-tls\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.533431 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9654c8c-9b4e-4f40-8226-ada3fccc7b28-metrics-tls\") pod \"dns-default-ppzj9\" (UID: \"c9654c8c-9b4e-4f40-8226-ada3fccc7b28\") " pod="openshift-dns/dns-default-ppzj9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.533565 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f27a472e-ea53-41f1-9360-84b2f9f0fd36-metrics-tls\") pod \"ingress-operator-5b745b69d9-75w2r\" (UID: \"f27a472e-ea53-41f1-9360-84b2f9f0fd36\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.533685 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f488db4f-55ab-4654-b225-38742d36877c-registry-certificates\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: E0105 13:49:53.535354 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:54.035331955 +0000 UTC m=+43.342240534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.549006 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.563172 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.579927 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.605967 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635128 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635322 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f488db4f-55ab-4654-b225-38742d36877c-trusted-ca\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635356 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/61ccfc6e-26f3-473d-9a65-40ca39aafca2-default-certificate\") pod \"router-default-5444994796-f8vfq\" (UID: \"61ccfc6e-26f3-473d-9a65-40ca39aafca2\") " pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635377 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1759d33c-e10d-4db8-91aa-60a30ff68255-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-lg5bc\" (UID: \"1759d33c-e10d-4db8-91aa-60a30ff68255\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635395 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrhsk\" (UniqueName: \"kubernetes.io/projected/f27a472e-ea53-41f1-9360-84b2f9f0fd36-kube-api-access-xrhsk\") pod \"ingress-operator-5b745b69d9-75w2r\" (UID: \"f27a472e-ea53-41f1-9360-84b2f9f0fd36\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635413 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khsbr\" (UniqueName: 
\"kubernetes.io/projected/817177d4-0bf1-4fa4-babc-86b3bc19af23-kube-api-access-khsbr\") pod \"service-ca-operator-777779d784-dztht\" (UID: \"817177d4-0bf1-4fa4-babc-86b3bc19af23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dztht" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635429 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfgrm\" (UniqueName: \"kubernetes.io/projected/eb24541e-e924-4085-b91e-e2d5a0bc8349-kube-api-access-xfgrm\") pod \"catalog-operator-68c6474976-xvzzm\" (UID: \"eb24541e-e924-4085-b91e-e2d5a0bc8349\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635459 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5e922e0-65fb-449b-a90a-793b79189089-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-59r7l\" (UID: \"e5e922e0-65fb-449b-a90a-793b79189089\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-59r7l" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635475 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5e922e0-65fb-449b-a90a-793b79189089-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-59r7l\" (UID: \"e5e922e0-65fb-449b-a90a-793b79189089\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-59r7l" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635490 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80bfed68-6820-4458-aa8a-779cc3120e43-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-45gg9\" (UID: \"80bfed68-6820-4458-aa8a-779cc3120e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635516 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrdv9\" (UniqueName: \"kubernetes.io/projected/3661c83f-42d5-4441-95c9-bc94757cc85d-kube-api-access-zrdv9\") pod \"csi-hostpathplugin-pcbgg\" (UID: \"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635532 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt2vz\" (UniqueName: \"kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-kube-api-access-xt2vz\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635548 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvk4f\" (UniqueName: \"kubernetes.io/projected/b43916b9-e413-4a80-880a-3feee7227ec5-kube-api-access-nvk4f\") pod \"packageserver-d55dfcdfc-tzhlt\" (UID: \"b43916b9-e413-4a80-880a-3feee7227ec5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635564 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/80bfed68-6820-4458-aa8a-779cc3120e43-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-45gg9\" (UID: \"80bfed68-6820-4458-aa8a-779cc3120e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635591 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnn4j\" (UniqueName: \"kubernetes.io/projected/c9654c8c-9b4e-4f40-8226-ada3fccc7b28-kube-api-access-lnn4j\") pod \"dns-default-ppzj9\" (UID: \"c9654c8c-9b4e-4f40-8226-ada3fccc7b28\") " pod="openshift-dns/dns-default-ppzj9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635607 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee74f549-8b8c-4bc6-b5c2-b2be35ad0697-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vc6s7\" (UID: \"ee74f549-8b8c-4bc6-b5c2-b2be35ad0697\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc6s7" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635622 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d06c65d1-809e-4e4f-922b-999763b9dd7a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zhjv\" (UID: \"d06c65d1-809e-4e4f-922b-999763b9dd7a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zhjv" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635635 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1759d33c-e10d-4db8-91aa-60a30ff68255-ready\") pod \"cni-sysctl-allowlist-ds-lg5bc\" (UID: \"1759d33c-e10d-4db8-91aa-60a30ff68255\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635651 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3661c83f-42d5-4441-95c9-bc94757cc85d-registration-dir\") pod \"csi-hostpathplugin-pcbgg\" (UID: \"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635664 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd-signing-key\") pod \"service-ca-9c57cc56f-5676h\" (UID: \"b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd\") " pod="openshift-service-ca/service-ca-9c57cc56f-5676h" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635678 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1759d33c-e10d-4db8-91aa-60a30ff68255-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-lg5bc\" (UID: \"1759d33c-e10d-4db8-91aa-60a30ff68255\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635707 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/f488db4f-55ab-4654-b225-38742d36877c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635722 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/817177d4-0bf1-4fa4-babc-86b3bc19af23-config\") pod \"service-ca-operator-777779d784-dztht\" (UID: \"817177d4-0bf1-4fa4-babc-86b3bc19af23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dztht" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635739 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-registry-tls\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635753 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9a8e3cda-8a5c-4308-85d5-c8d2abe334f7-images\") pod \"machine-config-operator-74547568cd-zpngz\" (UID: \"9a8e3cda-8a5c-4308-85d5-c8d2abe334f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635768 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9768b475-a5b9-407b-b87b-b2a7b5a90260-certs\") pod \"machine-config-server-g56kn\" (UID: \"9768b475-a5b9-407b-b87b-b2a7b5a90260\") " pod="openshift-machine-config-operator/machine-config-server-g56kn" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635807 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9w64\" (UniqueName: \"kubernetes.io/projected/8111c300-7d33-4696-a1f1-cb7e5ebfaa87-kube-api-access-h9w64\") pod \"ingress-canary-qxndq\" (UID: \"8111c300-7d33-4696-a1f1-cb7e5ebfaa87\") " pod="openshift-ingress-canary/ingress-canary-qxndq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635822 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd-signing-cabundle\") pod \"service-ca-9c57cc56f-5676h\" (UID: \"b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd\") " pod="openshift-service-ca/service-ca-9c57cc56f-5676h" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635856 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-secret-volume\") pod \"collect-profiles-29460345-f27lw\" (UID: \"409599e0-5f32-4b72-9c6a-73c9d9d4cc63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635871 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8111c300-7d33-4696-a1f1-cb7e5ebfaa87-cert\") pod \"ingress-canary-qxndq\" (UID: \"8111c300-7d33-4696-a1f1-cb7e5ebfaa87\") " 
pod="openshift-ingress-canary/ingress-canary-qxndq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635886 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7wg5\" (UniqueName: \"kubernetes.io/projected/9768b475-a5b9-407b-b87b-b2a7b5a90260-kube-api-access-d7wg5\") pod \"machine-config-server-g56kn\" (UID: \"9768b475-a5b9-407b-b87b-b2a7b5a90260\") " pod="openshift-machine-config-operator/machine-config-server-g56kn" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635900 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mmph\" (UniqueName: \"kubernetes.io/projected/80bfed68-6820-4458-aa8a-779cc3120e43-kube-api-access-2mmph\") pod \"marketplace-operator-79b997595-45gg9\" (UID: \"80bfed68-6820-4458-aa8a-779cc3120e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635914 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvfcw\" (UniqueName: \"kubernetes.io/projected/1759d33c-e10d-4db8-91aa-60a30ff68255-kube-api-access-cvfcw\") pod \"cni-sysctl-allowlist-ds-lg5bc\" (UID: \"1759d33c-e10d-4db8-91aa-60a30ff68255\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635931 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f27a472e-ea53-41f1-9360-84b2f9f0fd36-metrics-tls\") pod \"ingress-operator-5b745b69d9-75w2r\" (UID: \"f27a472e-ea53-41f1-9360-84b2f9f0fd36\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635954 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-etcd-ca\") pod \"etcd-operator-b45778765-tbzlt\" (UID: \"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635968 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9a8e3cda-8a5c-4308-85d5-c8d2abe334f7-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zpngz\" (UID: \"9a8e3cda-8a5c-4308-85d5-c8d2abe334f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635982 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj5cn\" (UniqueName: \"kubernetes.io/projected/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-kube-api-access-cj5cn\") pod \"collect-profiles-29460345-f27lw\" (UID: \"409599e0-5f32-4b72-9c6a-73c9d9d4cc63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.635996 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3661c83f-42d5-4441-95c9-bc94757cc85d-mountpoint-dir\") pod \"csi-hostpathplugin-pcbgg\" (UID: \"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:53 crc 
kubenswrapper[4740]: I0105 13:49:53.636020 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw4tj\" (UniqueName: \"kubernetes.io/projected/b9326591-f868-4939-9806-a5f09e56a0d0-kube-api-access-dw4tj\") pod \"migrator-59844c95c7-lwmk6\" (UID: \"b9326591-f868-4939-9806-a5f09e56a0d0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmk6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636040 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f488db4f-55ab-4654-b225-38742d36877c-registry-certificates\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636063 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5e922e0-65fb-449b-a90a-793b79189089-config\") pod \"kube-controller-manager-operator-78b949d7b-59r7l\" (UID: \"e5e922e0-65fb-449b-a90a-793b79189089\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-59r7l" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636101 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35423004-ee05-40a4-9125-6b948b8371e8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ngp5g\" (UID: \"35423004-ee05-40a4-9125-6b948b8371e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ngp5g" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636117 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d06c65d1-809e-4e4f-922b-999763b9dd7a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zhjv\" (UID: \"d06c65d1-809e-4e4f-922b-999763b9dd7a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zhjv" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636149 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eb24541e-e924-4085-b91e-e2d5a0bc8349-profile-collector-cert\") pod \"catalog-operator-68c6474976-xvzzm\" (UID: \"eb24541e-e924-4085-b91e-e2d5a0bc8349\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636166 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8255876c-990a-4658-8d74-66b4d45e379c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qhb46\" (UID: \"8255876c-990a-4658-8d74-66b4d45e379c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636179 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-etcd-service-ca\") pod \"etcd-operator-b45778765-tbzlt\" (UID: 
\"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636195 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8l6d\" (UniqueName: \"kubernetes.io/projected/8255876c-990a-4658-8d74-66b4d45e379c-kube-api-access-c8l6d\") pod \"olm-operator-6b444d44fb-qhb46\" (UID: \"8255876c-990a-4658-8d74-66b4d45e379c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636211 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms5xt\" (UniqueName: \"kubernetes.io/projected/b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd-kube-api-access-ms5xt\") pod \"service-ca-9c57cc56f-5676h\" (UID: \"b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd\") " pod="openshift-service-ca/service-ca-9c57cc56f-5676h" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636233 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f27a472e-ea53-41f1-9360-84b2f9f0fd36-trusted-ca\") pod \"ingress-operator-5b745b69d9-75w2r\" (UID: \"f27a472e-ea53-41f1-9360-84b2f9f0fd36\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636249 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61ccfc6e-26f3-473d-9a65-40ca39aafca2-service-ca-bundle\") pod \"router-default-5444994796-f8vfq\" (UID: \"61ccfc6e-26f3-473d-9a65-40ca39aafca2\") " pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636272 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-config-volume\") pod \"collect-profiles-29460345-f27lw\" (UID: \"409599e0-5f32-4b72-9c6a-73c9d9d4cc63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636287 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3661c83f-42d5-4441-95c9-bc94757cc85d-socket-dir\") pod \"csi-hostpathplugin-pcbgg\" (UID: \"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636316 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-etcd-client\") pod \"etcd-operator-b45778765-tbzlt\" (UID: \"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636333 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f488db4f-55ab-4654-b225-38742d36877c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 
13:49:53.636347 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b43916b9-e413-4a80-880a-3feee7227ec5-webhook-cert\") pod \"packageserver-d55dfcdfc-tzhlt\" (UID: \"b43916b9-e413-4a80-880a-3feee7227ec5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636372 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9654c8c-9b4e-4f40-8226-ada3fccc7b28-config-volume\") pod \"dns-default-ppzj9\" (UID: \"c9654c8c-9b4e-4f40-8226-ada3fccc7b28\") " pod="openshift-dns/dns-default-ppzj9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636389 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4h49\" (UniqueName: \"kubernetes.io/projected/d06c65d1-809e-4e4f-922b-999763b9dd7a-kube-api-access-g4h49\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zhjv\" (UID: \"d06c65d1-809e-4e4f-922b-999763b9dd7a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zhjv" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636407 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b43916b9-e413-4a80-880a-3feee7227ec5-apiservice-cert\") pod \"packageserver-d55dfcdfc-tzhlt\" (UID: \"b43916b9-e413-4a80-880a-3feee7227ec5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636423 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eb24541e-e924-4085-b91e-e2d5a0bc8349-srv-cert\") pod \"catalog-operator-68c6474976-xvzzm\" (UID: \"eb24541e-e924-4085-b91e-e2d5a0bc8349\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636439 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7d9be6f-7415-4c1a-a6b9-6f5222d5580f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kkdl5\" (UID: \"d7d9be6f-7415-4c1a-a6b9-6f5222d5580f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkdl5" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636457 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9656e5b3-4b38-4441-b2e2-13d3d2a31100-metrics-tls\") pod \"dns-operator-744455d44c-qvxnd\" (UID: \"9656e5b3-4b38-4441-b2e2-13d3d2a31100\") " pod="openshift-dns-operator/dns-operator-744455d44c-qvxnd" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636486 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnbjh\" (UniqueName: \"kubernetes.io/projected/9656e5b3-4b38-4441-b2e2-13d3d2a31100-kube-api-access-pnbjh\") pod \"dns-operator-744455d44c-qvxnd\" (UID: \"9656e5b3-4b38-4441-b2e2-13d3d2a31100\") " pod="openshift-dns-operator/dns-operator-744455d44c-qvxnd" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 
13:49:53.636518 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-429zf\" (UniqueName: \"kubernetes.io/projected/9a8e3cda-8a5c-4308-85d5-c8d2abe334f7-kube-api-access-429zf\") pod \"machine-config-operator-74547568cd-zpngz\" (UID: \"9a8e3cda-8a5c-4308-85d5-c8d2abe334f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636532 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3661c83f-42d5-4441-95c9-bc94757cc85d-plugins-dir\") pod \"csi-hostpathplugin-pcbgg\" (UID: \"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636548 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-bound-sa-token\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636564 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f27a472e-ea53-41f1-9360-84b2f9f0fd36-bound-sa-token\") pod \"ingress-operator-5b745b69d9-75w2r\" (UID: \"f27a472e-ea53-41f1-9360-84b2f9f0fd36\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636579 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8255876c-990a-4658-8d74-66b4d45e379c-srv-cert\") pod \"olm-operator-6b444d44fb-qhb46\" (UID: \"8255876c-990a-4658-8d74-66b4d45e379c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636593 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/61ccfc6e-26f3-473d-9a65-40ca39aafca2-stats-auth\") pod \"router-default-5444994796-f8vfq\" (UID: \"61ccfc6e-26f3-473d-9a65-40ca39aafca2\") " pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636614 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61ccfc6e-26f3-473d-9a65-40ca39aafca2-metrics-certs\") pod \"router-default-5444994796-f8vfq\" (UID: \"61ccfc6e-26f3-473d-9a65-40ca39aafca2\") " pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636647 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9n85\" (UniqueName: \"kubernetes.io/projected/89ec45a7-04a1-4913-b5e0-9ebc1d04f46c-kube-api-access-m9n85\") pod \"package-server-manager-789f6589d5-46x82\" (UID: \"89ec45a7-04a1-4913-b5e0-9ebc1d04f46c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636661 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee74f549-8b8c-4bc6-b5c2-b2be35ad0697-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vc6s7\" (UID: \"ee74f549-8b8c-4bc6-b5c2-b2be35ad0697\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc6s7" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636687 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfpr4\" (UniqueName: \"kubernetes.io/projected/5b944b6c-4c73-4527-8d67-b9a2a4004606-kube-api-access-dfpr4\") pod \"multus-admission-controller-857f4d67dd-67brg\" (UID: \"5b944b6c-4c73-4527-8d67-b9a2a4004606\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-67brg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636703 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/89ec45a7-04a1-4913-b5e0-9ebc1d04f46c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-46x82\" (UID: \"89ec45a7-04a1-4913-b5e0-9ebc1d04f46c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636719 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6af3027-f0d8-4616-8622-b07d4b8f0b94-proxy-tls\") pod \"machine-config-controller-84d6567774-lt6sv\" (UID: \"f6af3027-f0d8-4616-8622-b07d4b8f0b94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636742 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v548x\" (UniqueName: \"kubernetes.io/projected/f6af3027-f0d8-4616-8622-b07d4b8f0b94-kube-api-access-v548x\") pod \"machine-config-controller-84d6567774-lt6sv\" (UID: \"f6af3027-f0d8-4616-8622-b07d4b8f0b94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636759 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35423004-ee05-40a4-9125-6b948b8371e8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ngp5g\" (UID: \"35423004-ee05-40a4-9125-6b948b8371e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ngp5g" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636773 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnnrk\" (UniqueName: \"kubernetes.io/projected/d7d9be6f-7415-4c1a-a6b9-6f5222d5580f-kube-api-access-pnnrk\") pod \"control-plane-machine-set-operator-78cbb6b69f-kkdl5\" (UID: \"d7d9be6f-7415-4c1a-a6b9-6f5222d5580f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkdl5" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636789 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-config\") pod \"etcd-operator-b45778765-tbzlt\" (UID: \"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:53 crc 
kubenswrapper[4740]: I0105 13:49:53.636807 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9654c8c-9b4e-4f40-8226-ada3fccc7b28-metrics-tls\") pod \"dns-default-ppzj9\" (UID: \"c9654c8c-9b4e-4f40-8226-ada3fccc7b28\") " pod="openshift-dns/dns-default-ppzj9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636821 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/817177d4-0bf1-4fa4-babc-86b3bc19af23-serving-cert\") pod \"service-ca-operator-777779d784-dztht\" (UID: \"817177d4-0bf1-4fa4-babc-86b3bc19af23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dztht" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636838 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3661c83f-42d5-4441-95c9-bc94757cc85d-csi-data-dir\") pod \"csi-hostpathplugin-pcbgg\" (UID: \"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636854 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5b944b6c-4c73-4527-8d67-b9a2a4004606-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-67brg\" (UID: \"5b944b6c-4c73-4527-8d67-b9a2a4004606\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-67brg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636869 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b43916b9-e413-4a80-880a-3feee7227ec5-tmpfs\") pod \"packageserver-d55dfcdfc-tzhlt\" (UID: \"b43916b9-e413-4a80-880a-3feee7227ec5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636894 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a8e3cda-8a5c-4308-85d5-c8d2abe334f7-proxy-tls\") pod \"machine-config-operator-74547568cd-zpngz\" (UID: \"9a8e3cda-8a5c-4308-85d5-c8d2abe334f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636909 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee74f549-8b8c-4bc6-b5c2-b2be35ad0697-config\") pod \"kube-apiserver-operator-766d6c64bb-vc6s7\" (UID: \"ee74f549-8b8c-4bc6-b5c2-b2be35ad0697\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc6s7" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636924 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35423004-ee05-40a4-9125-6b948b8371e8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ngp5g\" (UID: \"35423004-ee05-40a4-9125-6b948b8371e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ngp5g" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636941 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-serving-cert\") pod \"etcd-operator-b45778765-tbzlt\" (UID: \"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636957 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf29w\" (UniqueName: \"kubernetes.io/projected/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-kube-api-access-wf29w\") pod \"etcd-operator-b45778765-tbzlt\" (UID: \"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636971 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f6af3027-f0d8-4616-8622-b07d4b8f0b94-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lt6sv\" (UID: \"f6af3027-f0d8-4616-8622-b07d4b8f0b94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.636997 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9768b475-a5b9-407b-b87b-b2a7b5a90260-node-bootstrap-token\") pod \"machine-config-server-g56kn\" (UID: \"9768b475-a5b9-407b-b87b-b2a7b5a90260\") " pod="openshift-machine-config-operator/machine-config-server-g56kn" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.637011 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlrkg\" (UniqueName: \"kubernetes.io/projected/61ccfc6e-26f3-473d-9a65-40ca39aafca2-kube-api-access-xlrkg\") pod \"router-default-5444994796-f8vfq\" (UID: \"61ccfc6e-26f3-473d-9a65-40ca39aafca2\") " pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:53 crc kubenswrapper[4740]: E0105 13:49:53.637533 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:54.137518824 +0000 UTC m=+43.444427403 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.638499 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f488db4f-55ab-4654-b225-38742d36877c-trusted-ca\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.644559 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f488db4f-55ab-4654-b225-38742d36877c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.644732 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f27a472e-ea53-41f1-9360-84b2f9f0fd36-trusted-ca\") pod \"ingress-operator-5b745b69d9-75w2r\" (UID: \"f27a472e-ea53-41f1-9360-84b2f9f0fd36\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.646392 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9654c8c-9b4e-4f40-8226-ada3fccc7b28-config-volume\") pod \"dns-default-ppzj9\" (UID: \"c9654c8c-9b4e-4f40-8226-ada3fccc7b28\") " pod="openshift-dns/dns-default-ppzj9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.654883 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f27a472e-ea53-41f1-9360-84b2f9f0fd36-metrics-tls\") pod \"ingress-operator-5b745b69d9-75w2r\" (UID: \"f27a472e-ea53-41f1-9360-84b2f9f0fd36\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.658437 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f488db4f-55ab-4654-b225-38742d36877c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.664910 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9654c8c-9b4e-4f40-8226-ada3fccc7b28-metrics-tls\") pod \"dns-default-ppzj9\" (UID: \"c9654c8c-9b4e-4f40-8226-ada3fccc7b28\") " pod="openshift-dns/dns-default-ppzj9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.665173 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f488db4f-55ab-4654-b225-38742d36877c-registry-certificates\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.668818 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-registry-tls\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.694779 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrhsk\" (UniqueName: \"kubernetes.io/projected/f27a472e-ea53-41f1-9360-84b2f9f0fd36-kube-api-access-xrhsk\") pod \"ingress-operator-5b745b69d9-75w2r\" (UID: \"f27a472e-ea53-41f1-9360-84b2f9f0fd36\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.724767 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f27a472e-ea53-41f1-9360-84b2f9f0fd36-bound-sa-token\") pod \"ingress-operator-5b745b69d9-75w2r\" (UID: \"f27a472e-ea53-41f1-9360-84b2f9f0fd36\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.728526 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-bound-sa-token\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.737714 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/817177d4-0bf1-4fa4-babc-86b3bc19af23-config\") pod \"service-ca-operator-777779d784-dztht\" (UID: \"817177d4-0bf1-4fa4-babc-86b3bc19af23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dztht" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.737753 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9a8e3cda-8a5c-4308-85d5-c8d2abe334f7-images\") pod \"machine-config-operator-74547568cd-zpngz\" (UID: \"9a8e3cda-8a5c-4308-85d5-c8d2abe334f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.737772 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9768b475-a5b9-407b-b87b-b2a7b5a90260-certs\") pod \"machine-config-server-g56kn\" (UID: \"9768b475-a5b9-407b-b87b-b2a7b5a90260\") " pod="openshift-machine-config-operator/machine-config-server-g56kn" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.737799 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9w64\" (UniqueName: \"kubernetes.io/projected/8111c300-7d33-4696-a1f1-cb7e5ebfaa87-kube-api-access-h9w64\") pod \"ingress-canary-qxndq\" (UID: \"8111c300-7d33-4696-a1f1-cb7e5ebfaa87\") " pod="openshift-ingress-canary/ingress-canary-qxndq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.737814 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd-signing-cabundle\") pod \"service-ca-9c57cc56f-5676h\" (UID: \"b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd\") " pod="openshift-service-ca/service-ca-9c57cc56f-5676h" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.737831 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-secret-volume\") pod \"collect-profiles-29460345-f27lw\" (UID: \"409599e0-5f32-4b72-9c6a-73c9d9d4cc63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.737848 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8111c300-7d33-4696-a1f1-cb7e5ebfaa87-cert\") pod \"ingress-canary-qxndq\" (UID: \"8111c300-7d33-4696-a1f1-cb7e5ebfaa87\") " pod="openshift-ingress-canary/ingress-canary-qxndq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.737863 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7wg5\" (UniqueName: \"kubernetes.io/projected/9768b475-a5b9-407b-b87b-b2a7b5a90260-kube-api-access-d7wg5\") pod \"machine-config-server-g56kn\" (UID: \"9768b475-a5b9-407b-b87b-b2a7b5a90260\") " pod="openshift-machine-config-operator/machine-config-server-g56kn" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.737878 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mmph\" (UniqueName: \"kubernetes.io/projected/80bfed68-6820-4458-aa8a-779cc3120e43-kube-api-access-2mmph\") pod \"marketplace-operator-79b997595-45gg9\" (UID: \"80bfed68-6820-4458-aa8a-779cc3120e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.737916 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvfcw\" (UniqueName: \"kubernetes.io/projected/1759d33c-e10d-4db8-91aa-60a30ff68255-kube-api-access-cvfcw\") pod \"cni-sysctl-allowlist-ds-lg5bc\" (UID: \"1759d33c-e10d-4db8-91aa-60a30ff68255\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.737944 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-etcd-ca\") pod \"etcd-operator-b45778765-tbzlt\" (UID: \"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.737964 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9a8e3cda-8a5c-4308-85d5-c8d2abe334f7-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zpngz\" (UID: \"9a8e3cda-8a5c-4308-85d5-c8d2abe334f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738003 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj5cn\" (UniqueName: \"kubernetes.io/projected/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-kube-api-access-cj5cn\") pod \"collect-profiles-29460345-f27lw\" (UID: \"409599e0-5f32-4b72-9c6a-73c9d9d4cc63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" Jan 05 
13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738021 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw4tj\" (UniqueName: \"kubernetes.io/projected/b9326591-f868-4939-9806-a5f09e56a0d0-kube-api-access-dw4tj\") pod \"migrator-59844c95c7-lwmk6\" (UID: \"b9326591-f868-4939-9806-a5f09e56a0d0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmk6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738037 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3661c83f-42d5-4441-95c9-bc94757cc85d-mountpoint-dir\") pod \"csi-hostpathplugin-pcbgg\" (UID: \"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738071 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5e922e0-65fb-449b-a90a-793b79189089-config\") pod \"kube-controller-manager-operator-78b949d7b-59r7l\" (UID: \"e5e922e0-65fb-449b-a90a-793b79189089\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-59r7l" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738100 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35423004-ee05-40a4-9125-6b948b8371e8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ngp5g\" (UID: \"35423004-ee05-40a4-9125-6b948b8371e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ngp5g" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738117 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d06c65d1-809e-4e4f-922b-999763b9dd7a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zhjv\" (UID: \"d06c65d1-809e-4e4f-922b-999763b9dd7a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zhjv" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738131 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eb24541e-e924-4085-b91e-e2d5a0bc8349-profile-collector-cert\") pod \"catalog-operator-68c6474976-xvzzm\" (UID: \"eb24541e-e924-4085-b91e-e2d5a0bc8349\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738146 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8255876c-990a-4658-8d74-66b4d45e379c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qhb46\" (UID: \"8255876c-990a-4658-8d74-66b4d45e379c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738162 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-etcd-service-ca\") pod \"etcd-operator-b45778765-tbzlt\" (UID: \"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738178 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8l6d\" (UniqueName: \"kubernetes.io/projected/8255876c-990a-4658-8d74-66b4d45e379c-kube-api-access-c8l6d\") pod \"olm-operator-6b444d44fb-qhb46\" (UID: \"8255876c-990a-4658-8d74-66b4d45e379c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738191 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms5xt\" (UniqueName: \"kubernetes.io/projected/b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd-kube-api-access-ms5xt\") pod \"service-ca-9c57cc56f-5676h\" (UID: \"b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd\") " pod="openshift-service-ca/service-ca-9c57cc56f-5676h" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738209 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61ccfc6e-26f3-473d-9a65-40ca39aafca2-service-ca-bundle\") pod \"router-default-5444994796-f8vfq\" (UID: \"61ccfc6e-26f3-473d-9a65-40ca39aafca2\") " pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738225 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-config-volume\") pod \"collect-profiles-29460345-f27lw\" (UID: \"409599e0-5f32-4b72-9c6a-73c9d9d4cc63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738240 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3661c83f-42d5-4441-95c9-bc94757cc85d-socket-dir\") pod \"csi-hostpathplugin-pcbgg\" (UID: \"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738256 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738273 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-etcd-client\") pod \"etcd-operator-b45778765-tbzlt\" (UID: \"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738291 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b43916b9-e413-4a80-880a-3feee7227ec5-webhook-cert\") pod \"packageserver-d55dfcdfc-tzhlt\" (UID: \"b43916b9-e413-4a80-880a-3feee7227ec5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738309 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4h49\" (UniqueName: \"kubernetes.io/projected/d06c65d1-809e-4e4f-922b-999763b9dd7a-kube-api-access-g4h49\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-8zhjv\" (UID: \"d06c65d1-809e-4e4f-922b-999763b9dd7a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zhjv" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738324 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b43916b9-e413-4a80-880a-3feee7227ec5-apiservice-cert\") pod \"packageserver-d55dfcdfc-tzhlt\" (UID: \"b43916b9-e413-4a80-880a-3feee7227ec5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738339 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eb24541e-e924-4085-b91e-e2d5a0bc8349-srv-cert\") pod \"catalog-operator-68c6474976-xvzzm\" (UID: \"eb24541e-e924-4085-b91e-e2d5a0bc8349\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738355 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7d9be6f-7415-4c1a-a6b9-6f5222d5580f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kkdl5\" (UID: \"d7d9be6f-7415-4c1a-a6b9-6f5222d5580f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkdl5" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738375 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9656e5b3-4b38-4441-b2e2-13d3d2a31100-metrics-tls\") pod \"dns-operator-744455d44c-qvxnd\" (UID: \"9656e5b3-4b38-4441-b2e2-13d3d2a31100\") " pod="openshift-dns-operator/dns-operator-744455d44c-qvxnd" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738395 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnbjh\" (UniqueName: \"kubernetes.io/projected/9656e5b3-4b38-4441-b2e2-13d3d2a31100-kube-api-access-pnbjh\") pod \"dns-operator-744455d44c-qvxnd\" (UID: \"9656e5b3-4b38-4441-b2e2-13d3d2a31100\") " pod="openshift-dns-operator/dns-operator-744455d44c-qvxnd" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738410 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-429zf\" (UniqueName: \"kubernetes.io/projected/9a8e3cda-8a5c-4308-85d5-c8d2abe334f7-kube-api-access-429zf\") pod \"machine-config-operator-74547568cd-zpngz\" (UID: \"9a8e3cda-8a5c-4308-85d5-c8d2abe334f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738426 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3661c83f-42d5-4441-95c9-bc94757cc85d-plugins-dir\") pod \"csi-hostpathplugin-pcbgg\" (UID: \"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738440 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8255876c-990a-4658-8d74-66b4d45e379c-srv-cert\") pod \"olm-operator-6b444d44fb-qhb46\" (UID: \"8255876c-990a-4658-8d74-66b4d45e379c\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738453 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/61ccfc6e-26f3-473d-9a65-40ca39aafca2-stats-auth\") pod \"router-default-5444994796-f8vfq\" (UID: \"61ccfc6e-26f3-473d-9a65-40ca39aafca2\") " pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738478 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9n85\" (UniqueName: \"kubernetes.io/projected/89ec45a7-04a1-4913-b5e0-9ebc1d04f46c-kube-api-access-m9n85\") pod \"package-server-manager-789f6589d5-46x82\" (UID: \"89ec45a7-04a1-4913-b5e0-9ebc1d04f46c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738495 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee74f549-8b8c-4bc6-b5c2-b2be35ad0697-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vc6s7\" (UID: \"ee74f549-8b8c-4bc6-b5c2-b2be35ad0697\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc6s7" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738509 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61ccfc6e-26f3-473d-9a65-40ca39aafca2-metrics-certs\") pod \"router-default-5444994796-f8vfq\" (UID: \"61ccfc6e-26f3-473d-9a65-40ca39aafca2\") " pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738525 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfpr4\" (UniqueName: \"kubernetes.io/projected/5b944b6c-4c73-4527-8d67-b9a2a4004606-kube-api-access-dfpr4\") pod \"multus-admission-controller-857f4d67dd-67brg\" (UID: \"5b944b6c-4c73-4527-8d67-b9a2a4004606\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-67brg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738540 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/89ec45a7-04a1-4913-b5e0-9ebc1d04f46c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-46x82\" (UID: \"89ec45a7-04a1-4913-b5e0-9ebc1d04f46c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738557 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6af3027-f0d8-4616-8622-b07d4b8f0b94-proxy-tls\") pod \"machine-config-controller-84d6567774-lt6sv\" (UID: \"f6af3027-f0d8-4616-8622-b07d4b8f0b94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738574 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v548x\" (UniqueName: \"kubernetes.io/projected/f6af3027-f0d8-4616-8622-b07d4b8f0b94-kube-api-access-v548x\") pod \"machine-config-controller-84d6567774-lt6sv\" (UID: \"f6af3027-f0d8-4616-8622-b07d4b8f0b94\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738595 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35423004-ee05-40a4-9125-6b948b8371e8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ngp5g\" (UID: \"35423004-ee05-40a4-9125-6b948b8371e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ngp5g" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738611 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnnrk\" (UniqueName: \"kubernetes.io/projected/d7d9be6f-7415-4c1a-a6b9-6f5222d5580f-kube-api-access-pnnrk\") pod \"control-plane-machine-set-operator-78cbb6b69f-kkdl5\" (UID: \"d7d9be6f-7415-4c1a-a6b9-6f5222d5580f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkdl5" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738627 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-config\") pod \"etcd-operator-b45778765-tbzlt\" (UID: \"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738642 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/817177d4-0bf1-4fa4-babc-86b3bc19af23-serving-cert\") pod \"service-ca-operator-777779d784-dztht\" (UID: \"817177d4-0bf1-4fa4-babc-86b3bc19af23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dztht" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738658 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3661c83f-42d5-4441-95c9-bc94757cc85d-csi-data-dir\") pod \"csi-hostpathplugin-pcbgg\" (UID: \"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738674 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5b944b6c-4c73-4527-8d67-b9a2a4004606-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-67brg\" (UID: \"5b944b6c-4c73-4527-8d67-b9a2a4004606\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-67brg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738690 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b43916b9-e413-4a80-880a-3feee7227ec5-tmpfs\") pod \"packageserver-d55dfcdfc-tzhlt\" (UID: \"b43916b9-e413-4a80-880a-3feee7227ec5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738710 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a8e3cda-8a5c-4308-85d5-c8d2abe334f7-proxy-tls\") pod \"machine-config-operator-74547568cd-zpngz\" (UID: \"9a8e3cda-8a5c-4308-85d5-c8d2abe334f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738729 4740 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee74f549-8b8c-4bc6-b5c2-b2be35ad0697-config\") pod \"kube-apiserver-operator-766d6c64bb-vc6s7\" (UID: \"ee74f549-8b8c-4bc6-b5c2-b2be35ad0697\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc6s7" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738746 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35423004-ee05-40a4-9125-6b948b8371e8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ngp5g\" (UID: \"35423004-ee05-40a4-9125-6b948b8371e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ngp5g" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738762 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-serving-cert\") pod \"etcd-operator-b45778765-tbzlt\" (UID: \"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738777 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf29w\" (UniqueName: \"kubernetes.io/projected/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-kube-api-access-wf29w\") pod \"etcd-operator-b45778765-tbzlt\" (UID: \"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738791 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f6af3027-f0d8-4616-8622-b07d4b8f0b94-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lt6sv\" (UID: \"f6af3027-f0d8-4616-8622-b07d4b8f0b94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738807 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9768b475-a5b9-407b-b87b-b2a7b5a90260-node-bootstrap-token\") pod \"machine-config-server-g56kn\" (UID: \"9768b475-a5b9-407b-b87b-b2a7b5a90260\") " pod="openshift-machine-config-operator/machine-config-server-g56kn" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738822 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlrkg\" (UniqueName: \"kubernetes.io/projected/61ccfc6e-26f3-473d-9a65-40ca39aafca2-kube-api-access-xlrkg\") pod \"router-default-5444994796-f8vfq\" (UID: \"61ccfc6e-26f3-473d-9a65-40ca39aafca2\") " pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738837 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/61ccfc6e-26f3-473d-9a65-40ca39aafca2-default-certificate\") pod \"router-default-5444994796-f8vfq\" (UID: \"61ccfc6e-26f3-473d-9a65-40ca39aafca2\") " pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738853 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1759d33c-e10d-4db8-91aa-60a30ff68255-tuning-conf-dir\") pod 
\"cni-sysctl-allowlist-ds-lg5bc\" (UID: \"1759d33c-e10d-4db8-91aa-60a30ff68255\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738869 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khsbr\" (UniqueName: \"kubernetes.io/projected/817177d4-0bf1-4fa4-babc-86b3bc19af23-kube-api-access-khsbr\") pod \"service-ca-operator-777779d784-dztht\" (UID: \"817177d4-0bf1-4fa4-babc-86b3bc19af23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dztht" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738886 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfgrm\" (UniqueName: \"kubernetes.io/projected/eb24541e-e924-4085-b91e-e2d5a0bc8349-kube-api-access-xfgrm\") pod \"catalog-operator-68c6474976-xvzzm\" (UID: \"eb24541e-e924-4085-b91e-e2d5a0bc8349\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738901 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5e922e0-65fb-449b-a90a-793b79189089-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-59r7l\" (UID: \"e5e922e0-65fb-449b-a90a-793b79189089\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-59r7l" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738916 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5e922e0-65fb-449b-a90a-793b79189089-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-59r7l\" (UID: \"e5e922e0-65fb-449b-a90a-793b79189089\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-59r7l" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738931 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80bfed68-6820-4458-aa8a-779cc3120e43-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-45gg9\" (UID: \"80bfed68-6820-4458-aa8a-779cc3120e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738947 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrdv9\" (UniqueName: \"kubernetes.io/projected/3661c83f-42d5-4441-95c9-bc94757cc85d-kube-api-access-zrdv9\") pod \"csi-hostpathplugin-pcbgg\" (UID: \"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738969 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvk4f\" (UniqueName: \"kubernetes.io/projected/b43916b9-e413-4a80-880a-3feee7227ec5-kube-api-access-nvk4f\") pod \"packageserver-d55dfcdfc-tzhlt\" (UID: \"b43916b9-e413-4a80-880a-3feee7227ec5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.738984 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/80bfed68-6820-4458-aa8a-779cc3120e43-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-45gg9\" (UID: 
\"80bfed68-6820-4458-aa8a-779cc3120e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.739005 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee74f549-8b8c-4bc6-b5c2-b2be35ad0697-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vc6s7\" (UID: \"ee74f549-8b8c-4bc6-b5c2-b2be35ad0697\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc6s7" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.739027 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d06c65d1-809e-4e4f-922b-999763b9dd7a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zhjv\" (UID: \"d06c65d1-809e-4e4f-922b-999763b9dd7a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zhjv" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.739048 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1759d33c-e10d-4db8-91aa-60a30ff68255-ready\") pod \"cni-sysctl-allowlist-ds-lg5bc\" (UID: \"1759d33c-e10d-4db8-91aa-60a30ff68255\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.739112 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3661c83f-42d5-4441-95c9-bc94757cc85d-registration-dir\") pod \"csi-hostpathplugin-pcbgg\" (UID: \"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.739134 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd-signing-key\") pod \"service-ca-9c57cc56f-5676h\" (UID: \"b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd\") " pod="openshift-service-ca/service-ca-9c57cc56f-5676h" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.739154 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1759d33c-e10d-4db8-91aa-60a30ff68255-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-lg5bc\" (UID: \"1759d33c-e10d-4db8-91aa-60a30ff68255\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.739859 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1759d33c-e10d-4db8-91aa-60a30ff68255-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-lg5bc\" (UID: \"1759d33c-e10d-4db8-91aa-60a30ff68255\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.741099 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/817177d4-0bf1-4fa4-babc-86b3bc19af23-config\") pod \"service-ca-operator-777779d784-dztht\" (UID: \"817177d4-0bf1-4fa4-babc-86b3bc19af23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dztht" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.747213 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/9a8e3cda-8a5c-4308-85d5-c8d2abe334f7-images\") pod \"machine-config-operator-74547568cd-zpngz\" (UID: \"9a8e3cda-8a5c-4308-85d5-c8d2abe334f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.748945 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd-signing-cabundle\") pod \"service-ca-9c57cc56f-5676h\" (UID: \"b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd\") " pod="openshift-service-ca/service-ca-9c57cc56f-5676h" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.749169 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3661c83f-42d5-4441-95c9-bc94757cc85d-mountpoint-dir\") pod \"csi-hostpathplugin-pcbgg\" (UID: \"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.749737 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-etcd-ca\") pod \"etcd-operator-b45778765-tbzlt\" (UID: \"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.750142 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9a8e3cda-8a5c-4308-85d5-c8d2abe334f7-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zpngz\" (UID: \"9a8e3cda-8a5c-4308-85d5-c8d2abe334f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.756852 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9768b475-a5b9-407b-b87b-b2a7b5a90260-node-bootstrap-token\") pod \"machine-config-server-g56kn\" (UID: \"9768b475-a5b9-407b-b87b-b2a7b5a90260\") " pod="openshift-machine-config-operator/machine-config-server-g56kn" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.760572 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3661c83f-42d5-4441-95c9-bc94757cc85d-plugins-dir\") pod \"csi-hostpathplugin-pcbgg\" (UID: \"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.764142 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35423004-ee05-40a4-9125-6b948b8371e8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ngp5g\" (UID: \"35423004-ee05-40a4-9125-6b948b8371e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ngp5g" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.764492 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80bfed68-6820-4458-aa8a-779cc3120e43-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-45gg9\" (UID: \"80bfed68-6820-4458-aa8a-779cc3120e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 
13:49:53.773057 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d06c65d1-809e-4e4f-922b-999763b9dd7a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zhjv\" (UID: \"d06c65d1-809e-4e4f-922b-999763b9dd7a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zhjv" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.773419 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-config\") pod \"etcd-operator-b45778765-tbzlt\" (UID: \"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.774170 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee74f549-8b8c-4bc6-b5c2-b2be35ad0697-config\") pod \"kube-apiserver-operator-766d6c64bb-vc6s7\" (UID: \"ee74f549-8b8c-4bc6-b5c2-b2be35ad0697\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc6s7" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.774247 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3661c83f-42d5-4441-95c9-bc94757cc85d-csi-data-dir\") pod \"csi-hostpathplugin-pcbgg\" (UID: \"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.775157 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5e922e0-65fb-449b-a90a-793b79189089-config\") pod \"kube-controller-manager-operator-78b949d7b-59r7l\" (UID: \"e5e922e0-65fb-449b-a90a-793b79189089\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-59r7l" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.778505 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-etcd-service-ca\") pod \"etcd-operator-b45778765-tbzlt\" (UID: \"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.779225 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61ccfc6e-26f3-473d-9a65-40ca39aafca2-service-ca-bundle\") pod \"router-default-5444994796-f8vfq\" (UID: \"61ccfc6e-26f3-473d-9a65-40ca39aafca2\") " pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.779754 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-config-volume\") pod \"collect-profiles-29460345-f27lw\" (UID: \"409599e0-5f32-4b72-9c6a-73c9d9d4cc63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.779813 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3661c83f-42d5-4441-95c9-bc94757cc85d-socket-dir\") pod \"csi-hostpathplugin-pcbgg\" (UID: 
\"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:53 crc kubenswrapper[4740]: E0105 13:49:53.780031 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:54.280019173 +0000 UTC m=+43.586927752 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.780637 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1759d33c-e10d-4db8-91aa-60a30ff68255-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-lg5bc\" (UID: \"1759d33c-e10d-4db8-91aa-60a30ff68255\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.781232 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d06c65d1-809e-4e4f-922b-999763b9dd7a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zhjv\" (UID: \"d06c65d1-809e-4e4f-922b-999763b9dd7a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zhjv" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.781638 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1759d33c-e10d-4db8-91aa-60a30ff68255-ready\") pod \"cni-sysctl-allowlist-ds-lg5bc\" (UID: \"1759d33c-e10d-4db8-91aa-60a30ff68255\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.781691 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3661c83f-42d5-4441-95c9-bc94757cc85d-registration-dir\") pod \"csi-hostpathplugin-pcbgg\" (UID: \"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.782230 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b43916b9-e413-4a80-880a-3feee7227ec5-tmpfs\") pod \"packageserver-d55dfcdfc-tzhlt\" (UID: \"b43916b9-e413-4a80-880a-3feee7227ec5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.787547 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9656e5b3-4b38-4441-b2e2-13d3d2a31100-metrics-tls\") pod \"dns-operator-744455d44c-qvxnd\" (UID: \"9656e5b3-4b38-4441-b2e2-13d3d2a31100\") " pod="openshift-dns-operator/dns-operator-744455d44c-qvxnd" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.787967 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61ccfc6e-26f3-473d-9a65-40ca39aafca2-metrics-certs\") pod 
\"router-default-5444994796-f8vfq\" (UID: \"61ccfc6e-26f3-473d-9a65-40ca39aafca2\") " pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.792150 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/61ccfc6e-26f3-473d-9a65-40ca39aafca2-default-certificate\") pod \"router-default-5444994796-f8vfq\" (UID: \"61ccfc6e-26f3-473d-9a65-40ca39aafca2\") " pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.792815 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5b944b6c-4c73-4527-8d67-b9a2a4004606-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-67brg\" (UID: \"5b944b6c-4c73-4527-8d67-b9a2a4004606\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-67brg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.793892 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.794049 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f6af3027-f0d8-4616-8622-b07d4b8f0b94-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lt6sv\" (UID: \"f6af3027-f0d8-4616-8622-b07d4b8f0b94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.806756 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-serving-cert\") pod \"etcd-operator-b45778765-tbzlt\" (UID: \"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.807408 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/80bfed68-6820-4458-aa8a-779cc3120e43-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-45gg9\" (UID: \"80bfed68-6820-4458-aa8a-779cc3120e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.807567 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7d9be6f-7415-4c1a-a6b9-6f5222d5580f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kkdl5\" (UID: \"d7d9be6f-7415-4c1a-a6b9-6f5222d5580f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkdl5" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.809576 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35423004-ee05-40a4-9125-6b948b8371e8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ngp5g\" (UID: \"35423004-ee05-40a4-9125-6b948b8371e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ngp5g" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.810436 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9n85\" (UniqueName: 
\"kubernetes.io/projected/89ec45a7-04a1-4913-b5e0-9ebc1d04f46c-kube-api-access-m9n85\") pod \"package-server-manager-789f6589d5-46x82\" (UID: \"89ec45a7-04a1-4913-b5e0-9ebc1d04f46c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.810701 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b43916b9-e413-4a80-880a-3feee7227ec5-apiservice-cert\") pod \"packageserver-d55dfcdfc-tzhlt\" (UID: \"b43916b9-e413-4a80-880a-3feee7227ec5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.810985 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a8e3cda-8a5c-4308-85d5-c8d2abe334f7-proxy-tls\") pod \"machine-config-operator-74547568cd-zpngz\" (UID: \"9a8e3cda-8a5c-4308-85d5-c8d2abe334f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.811112 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5e922e0-65fb-449b-a90a-793b79189089-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-59r7l\" (UID: \"e5e922e0-65fb-449b-a90a-793b79189089\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-59r7l" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.811127 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd-signing-key\") pod \"service-ca-9c57cc56f-5676h\" (UID: \"b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd\") " pod="openshift-service-ca/service-ca-9c57cc56f-5676h" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.811420 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee74f549-8b8c-4bc6-b5c2-b2be35ad0697-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vc6s7\" (UID: \"ee74f549-8b8c-4bc6-b5c2-b2be35ad0697\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc6s7" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.812502 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eb24541e-e924-4085-b91e-e2d5a0bc8349-profile-collector-cert\") pod \"catalog-operator-68c6474976-xvzzm\" (UID: \"eb24541e-e924-4085-b91e-e2d5a0bc8349\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.812815 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/817177d4-0bf1-4fa4-babc-86b3bc19af23-serving-cert\") pod \"service-ca-operator-777779d784-dztht\" (UID: \"817177d4-0bf1-4fa4-babc-86b3bc19af23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dztht" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.818518 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-etcd-client\") pod \"etcd-operator-b45778765-tbzlt\" (UID: \"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.823322 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eb24541e-e924-4085-b91e-e2d5a0bc8349-srv-cert\") pod \"catalog-operator-68c6474976-xvzzm\" (UID: \"eb24541e-e924-4085-b91e-e2d5a0bc8349\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.824689 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8111c300-7d33-4696-a1f1-cb7e5ebfaa87-cert\") pod \"ingress-canary-qxndq\" (UID: \"8111c300-7d33-4696-a1f1-cb7e5ebfaa87\") " pod="openshift-ingress-canary/ingress-canary-qxndq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.825306 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8255876c-990a-4658-8d74-66b4d45e379c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qhb46\" (UID: \"8255876c-990a-4658-8d74-66b4d45e379c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.825460 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b43916b9-e413-4a80-880a-3feee7227ec5-webhook-cert\") pod \"packageserver-d55dfcdfc-tzhlt\" (UID: \"b43916b9-e413-4a80-880a-3feee7227ec5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.825460 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9768b475-a5b9-407b-b87b-b2a7b5a90260-certs\") pod \"machine-config-server-g56kn\" (UID: \"9768b475-a5b9-407b-b87b-b2a7b5a90260\") " pod="openshift-machine-config-operator/machine-config-server-g56kn" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.825591 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-secret-volume\") pod \"collect-profiles-29460345-f27lw\" (UID: \"409599e0-5f32-4b72-9c6a-73c9d9d4cc63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.825696 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/89ec45a7-04a1-4913-b5e0-9ebc1d04f46c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-46x82\" (UID: \"89ec45a7-04a1-4913-b5e0-9ebc1d04f46c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.825834 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6af3027-f0d8-4616-8622-b07d4b8f0b94-proxy-tls\") pod \"machine-config-controller-84d6567774-lt6sv\" (UID: \"f6af3027-f0d8-4616-8622-b07d4b8f0b94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.827362 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/61ccfc6e-26f3-473d-9a65-40ca39aafca2-stats-auth\") pod \"router-default-5444994796-f8vfq\" (UID: \"61ccfc6e-26f3-473d-9a65-40ca39aafca2\") " pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.829779 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8255876c-990a-4658-8d74-66b4d45e379c-srv-cert\") pod \"olm-operator-6b444d44fb-qhb46\" (UID: \"8255876c-990a-4658-8d74-66b4d45e379c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.839588 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:53 crc kubenswrapper[4740]: E0105 13:49:53.840040 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:54.340025815 +0000 UTC m=+43.646934394 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.845760 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt2vz\" (UniqueName: \"kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-kube-api-access-xt2vz\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.846025 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee74f549-8b8c-4bc6-b5c2-b2be35ad0697-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vc6s7\" (UID: \"ee74f549-8b8c-4bc6-b5c2-b2be35ad0697\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc6s7" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.846362 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnn4j\" (UniqueName: \"kubernetes.io/projected/c9654c8c-9b4e-4f40-8226-ada3fccc7b28-kube-api-access-lnn4j\") pod \"dns-default-ppzj9\" (UID: \"c9654c8c-9b4e-4f40-8226-ada3fccc7b28\") " pod="openshift-dns/dns-default-ppzj9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.854511 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvfcw\" (UniqueName: \"kubernetes.io/projected/1759d33c-e10d-4db8-91aa-60a30ff68255-kube-api-access-cvfcw\") pod \"cni-sysctl-allowlist-ds-lg5bc\" (UID: \"1759d33c-e10d-4db8-91aa-60a30ff68255\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.887900 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7wg5\" (UniqueName: \"kubernetes.io/projected/9768b475-a5b9-407b-b87b-b2a7b5a90260-kube-api-access-d7wg5\") pod \"machine-config-server-g56kn\" (UID: \"9768b475-a5b9-407b-b87b-b2a7b5a90260\") " pod="openshift-machine-config-operator/machine-config-server-g56kn" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.891653 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfpr4\" (UniqueName: \"kubernetes.io/projected/5b944b6c-4c73-4527-8d67-b9a2a4004606-kube-api-access-dfpr4\") pod \"multus-admission-controller-857f4d67dd-67brg\" (UID: \"5b944b6c-4c73-4527-8d67-b9a2a4004606\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-67brg" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.901468 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.921316 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9w64\" (UniqueName: \"kubernetes.io/projected/8111c300-7d33-4696-a1f1-cb7e5ebfaa87-kube-api-access-h9w64\") pod \"ingress-canary-qxndq\" (UID: \"8111c300-7d33-4696-a1f1-cb7e5ebfaa87\") " pod="openshift-ingress-canary/ingress-canary-qxndq" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.937986 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35423004-ee05-40a4-9125-6b948b8371e8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ngp5g\" (UID: \"35423004-ee05-40a4-9125-6b948b8371e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ngp5g" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.947280 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:53 crc kubenswrapper[4740]: E0105 13:49:53.947643 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:54.447632075 +0000 UTC m=+43.754540654 (durationBeforeRetry 500ms). 
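
The repeating E0105 nestedpendingoperations entries above all record the same condition: MountVolume.MountDevice for PVC pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 (needed by the image-registry pod) cannot proceed because the CSI driver name kubevirt.io.hostpath-provisioner is not yet in the kubelet's list of registered CSI drivers, so each attempt is rescheduled with a fixed 500 ms backoff ("No retries permitted until ..., durationBeforeRetry 500ms"). The snippet below is only a minimal Go sketch of that lookup-and-retry pattern; registeredCSIDrivers, lookupDriver and mountDevice are illustrative names, not kubelet identifiers.

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // Illustrative stand-in for the kubelet's set of registered CSI drivers.
    // "kubevirt.io.hostpath-provisioner" stays absent until its node plugin
    // registers, which is exactly the state this log shows.
    var registeredCSIDrivers = map[string]bool{}

    func lookupDriver(name string) error {
        if !registeredCSIDrivers[name] {
            return fmt.Errorf("driver name %s not found in the list of registered CSI drivers", name)
        }
        return nil
    }

    // mountDevice retries with a fixed backoff, mirroring the
    // "durationBeforeRetry 500ms" rescheduling seen in the log.
    func mountDevice(driver string, attempts int) error {
        const backoff = 500 * time.Millisecond
        for i := 0; i < attempts; i++ {
            if lookupDriver(driver) == nil {
                return nil // driver registered; MountDevice could proceed
            }
            time.Sleep(backoff) // "No retries permitted until" roughly now+500ms
        }
        return errors.New("MountVolume.MountDevice failed: driver still unregistered")
    }

    func main() {
        fmt.Println(mountDevice("kubevirt.io.hostpath-provisioner", 3))
    }
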
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.956740 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj5cn\" (UniqueName: \"kubernetes.io/projected/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-kube-api-access-cj5cn\") pod \"collect-profiles-29460345-f27lw\" (UID: \"409599e0-5f32-4b72-9c6a-73c9d9d4cc63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.965004 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw4tj\" (UniqueName: \"kubernetes.io/projected/b9326591-f868-4939-9806-a5f09e56a0d0-kube-api-access-dw4tj\") pod \"migrator-59844c95c7-lwmk6\" (UID: \"b9326591-f868-4939-9806-a5f09e56a0d0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmk6" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.972143 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.983954 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mmph\" (UniqueName: \"kubernetes.io/projected/80bfed68-6820-4458-aa8a-779cc3120e43-kube-api-access-2mmph\") pod \"marketplace-operator-79b997595-45gg9\" (UID: \"80bfed68-6820-4458-aa8a-779cc3120e43\") " pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" Jan 05 13:49:53 crc kubenswrapper[4740]: I0105 13:49:53.988741 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.001355 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qxndq" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.004187 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-429zf\" (UniqueName: \"kubernetes.io/projected/9a8e3cda-8a5c-4308-85d5-c8d2abe334f7-kube-api-access-429zf\") pod \"machine-config-operator-74547568cd-zpngz\" (UID: \"9a8e3cda-8a5c-4308-85d5-c8d2abe334f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.006366 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-g56kn" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.021786 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnbjh\" (UniqueName: \"kubernetes.io/projected/9656e5b3-4b38-4441-b2e2-13d3d2a31100-kube-api-access-pnbjh\") pod \"dns-operator-744455d44c-qvxnd\" (UID: \"9656e5b3-4b38-4441-b2e2-13d3d2a31100\") " pod="openshift-dns-operator/dns-operator-744455d44c-qvxnd" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.044728 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v548x\" (UniqueName: \"kubernetes.io/projected/f6af3027-f0d8-4616-8622-b07d4b8f0b94-kube-api-access-v548x\") pod \"machine-config-controller-84d6567774-lt6sv\" (UID: \"f6af3027-f0d8-4616-8622-b07d4b8f0b94\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.050543 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:54 crc kubenswrapper[4740]: E0105 13:49:54.050725 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:54.550697148 +0000 UTC m=+43.857605727 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.050877 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:54 crc kubenswrapper[4740]: E0105 13:49:54.051320 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:54.551311703 +0000 UTC m=+43.858220282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.059698 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnnrk\" (UniqueName: \"kubernetes.io/projected/d7d9be6f-7415-4c1a-a6b9-6f5222d5580f-kube-api-access-pnnrk\") pod \"control-plane-machine-set-operator-78cbb6b69f-kkdl5\" (UID: \"d7d9be6f-7415-4c1a-a6b9-6f5222d5580f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkdl5" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.062387 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.084405 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlrkg\" (UniqueName: \"kubernetes.io/projected/61ccfc6e-26f3-473d-9a65-40ca39aafca2-kube-api-access-xlrkg\") pod \"router-default-5444994796-f8vfq\" (UID: \"61ccfc6e-26f3-473d-9a65-40ca39aafca2\") " pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.092059 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ppzj9" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.101325 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc6s7" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.108898 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qvxnd" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.124022 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4h49\" (UniqueName: \"kubernetes.io/projected/d06c65d1-809e-4e4f-922b-999763b9dd7a-kube-api-access-g4h49\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zhjv\" (UID: \"d06c65d1-809e-4e4f-922b-999763b9dd7a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zhjv" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.124192 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.132801 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.134271 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8l6d\" (UniqueName: \"kubernetes.io/projected/8255876c-990a-4658-8d74-66b4d45e379c-kube-api-access-c8l6d\") pod \"olm-operator-6b444d44fb-qhb46\" (UID: \"8255876c-990a-4658-8d74-66b4d45e379c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.137352 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ngp5g" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.144723 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zhjv" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.169432 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.169812 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.170200 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmk6" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.170414 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkdl5" Jan 05 13:49:54 crc kubenswrapper[4740]: E0105 13:49:54.170794 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:54.670779453 +0000 UTC m=+43.977688032 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.172711 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms5xt\" (UniqueName: \"kubernetes.io/projected/b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd-kube-api-access-ms5xt\") pod \"service-ca-9c57cc56f-5676h\" (UID: \"b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd\") " pod="openshift-service-ca/service-ca-9c57cc56f-5676h" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.179424 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-67brg" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.184143 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.193441 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.196665 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfgrm\" (UniqueName: \"kubernetes.io/projected/eb24541e-e924-4085-b91e-e2d5a0bc8349-kube-api-access-xfgrm\") pod \"catalog-operator-68c6474976-xvzzm\" (UID: \"eb24541e-e924-4085-b91e-e2d5a0bc8349\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.196987 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5e922e0-65fb-449b-a90a-793b79189089-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-59r7l\" (UID: \"e5e922e0-65fb-449b-a90a-793b79189089\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-59r7l" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.198600 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khsbr\" (UniqueName: \"kubernetes.io/projected/817177d4-0bf1-4fa4-babc-86b3bc19af23-kube-api-access-khsbr\") pod \"service-ca-operator-777779d784-dztht\" (UID: \"817177d4-0bf1-4fa4-babc-86b3bc19af23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dztht" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.201885 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r"] Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.209140 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.217578 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvk4f\" (UniqueName: \"kubernetes.io/projected/b43916b9-e413-4a80-880a-3feee7227ec5-kube-api-access-nvk4f\") pod \"packageserver-d55dfcdfc-tzhlt\" (UID: \"b43916b9-e413-4a80-880a-3feee7227ec5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.235720 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrdv9\" (UniqueName: \"kubernetes.io/projected/3661c83f-42d5-4441-95c9-bc94757cc85d-kube-api-access-zrdv9\") pod \"csi-hostpathplugin-pcbgg\" (UID: \"3661c83f-42d5-4441-95c9-bc94757cc85d\") " pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.247508 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.248814 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5676h" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.277964 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:54 crc kubenswrapper[4740]: E0105 13:49:54.278451 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:54.778440084 +0000 UTC m=+44.085348663 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.278846 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.282142 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf29w\" (UniqueName: \"kubernetes.io/projected/f2c9c9bf-e6ce-4deb-b0d2-0669949b739e-kube-api-access-wf29w\") pod \"etcd-operator-b45778765-tbzlt\" (UID: \"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.378703 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:54 crc kubenswrapper[4740]: E0105 13:49:54.378999 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:54.878982721 +0000 UTC m=+44.185891300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.381389 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" event={"ID":"a03ac734-fdb2-4390-a5dc-1aed999390b4","Type":"ContainerStarted","Data":"28c9fa655953c9d778e4633ec3f26139bb629df4e556bf86092fec0346f182d7"} Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.385854 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" event={"ID":"1759d33c-e10d-4db8-91aa-60a30ff68255","Type":"ContainerStarted","Data":"393bd6d65b0a337b85cff9277b1fa9a0b6714ed5cdb7d84ec7da17e7f021db9a"} Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.388265 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82"] Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.407762 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" event={"ID":"8707fead-c468-4d30-8966-238dee410c47","Type":"ContainerStarted","Data":"adade5de694f6ed33b7d8f9d719e14cabdac7dcc4ca77500fc1e90a3400df825"} Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.412958 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-59r7l" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.417303 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.432974 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn" event={"ID":"3087a0cb-777e-4c57-b376-aa006b7853c1","Type":"ContainerStarted","Data":"a4b56e5337e66860fee91f25eee5888809c894593c143b8513e17543b2454008"} Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.434115 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.448512 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-g56kn" event={"ID":"9768b475-a5b9-407b-b87b-b2a7b5a90260","Type":"ContainerStarted","Data":"5d831440b54a0d5953c4e0a064806c90d2710764d3c6720d7a378e8a65f7bd07"} Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.452938 4740 patch_prober.go:28] interesting pod/downloads-7954f5f757-tjvv9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.453004 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tjvv9" podUID="f6a82703-5aea-4d1a-a8fa-3e4393a1176b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.481648 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:54 crc kubenswrapper[4740]: E0105 13:49:54.482055 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:54.982042664 +0000 UTC m=+44.288951243 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.482501 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dztht" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.531744 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.535687 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.582328 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:54 crc kubenswrapper[4740]: E0105 13:49:54.584044 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:55.084025228 +0000 UTC m=+44.390933807 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.686293 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:54 crc kubenswrapper[4740]: E0105 13:49:54.686761 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:55.186749781 +0000 UTC m=+44.493658360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.758926 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lr6c7" podStartSLOduration=15.75890823 podStartE2EDuration="15.75890823s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:54.755788909 +0000 UTC m=+44.062697488" watchObservedRunningTime="2026-01-05 13:49:54.75890823 +0000 UTC m=+44.065816809" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.759136 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qxndq"] Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.787672 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:54 crc kubenswrapper[4740]: E0105 13:49:54.787940 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:55.287924954 +0000 UTC m=+44.594833533 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.788025 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:54 crc kubenswrapper[4740]: E0105 13:49:54.788350 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:55.288342605 +0000 UTC m=+44.595251184 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.835739 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz"] Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.869819 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-qj9kj" podStartSLOduration=15.869803635 podStartE2EDuration="15.869803635s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:54.869634791 +0000 UTC m=+44.176543370" watchObservedRunningTime="2026-01-05 13:49:54.869803635 +0000 UTC m=+44.176712214" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.893179 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:54 crc kubenswrapper[4740]: E0105 13:49:54.899114 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:55.399084148 +0000 UTC m=+44.705992727 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.961927 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-tjvv9" podStartSLOduration=15.961906472 podStartE2EDuration="15.961906472s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:54.930213148 +0000 UTC m=+44.237121727" watchObservedRunningTime="2026-01-05 13:49:54.961906472 +0000 UTC m=+44.268815051" Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.977594 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45gg9"] Jan 05 13:49:54 crc kubenswrapper[4740]: I0105 13:49:54.998931 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:54 crc kubenswrapper[4740]: E0105 13:49:54.999386 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:55.499369837 +0000 UTC m=+44.806278416 (durationBeforeRetry 500ms). 
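
The pod_startup_latency_tracker entries report podStartSLOduration and podStartE2EDuration for pods that have just been observed running. With no image pulls recorded (firstStartedPulling and lastFinishedPulling are left at the zero time), both figures reduce to the watch-observed running time minus the pod creation timestamp, e.g. 13:49:54.75890823 - 13:49:39 = 15.75890823 s for cluster-samples-operator. The short program below just reconstructs that arithmetic with the two timestamps copied from the log.

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        created, err1 := time.Parse(layout, "2026-01-05 13:49:39 +0000 UTC")
        observed, err2 := time.Parse(layout, "2026-01-05 13:49:54.75890823 +0000 UTC")
        if err1 != nil || err2 != nil {
            panic("bad timestamp")
        }
        // With no image-pull interval to subtract, the reported duration is just
        // watchObservedRunningTime - podCreationTimestamp.
        fmt.Println(observed.Sub(created)) // 15.75890823s, matching podStartE2EDuration
    }
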
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:55 crc kubenswrapper[4740]: W0105 13:49:55.064027 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a8e3cda_8a5c_4308_85d5_c8d2abe334f7.slice/crio-40af162ae9148cd076ba50a8c8335ecbdb89806d3f1ec84a365dc1782499c5f0 WatchSource:0}: Error finding container 40af162ae9148cd076ba50a8c8335ecbdb89806d3f1ec84a365dc1782499c5f0: Status 404 returned error can't find the container with id 40af162ae9148cd076ba50a8c8335ecbdb89806d3f1ec84a365dc1782499c5f0 Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.079092 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" podStartSLOduration=16.079053121 podStartE2EDuration="16.079053121s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:55.077575392 +0000 UTC m=+44.384483971" watchObservedRunningTime="2026-01-05 13:49:55.079053121 +0000 UTC m=+44.385961700" Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.102090 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.102344 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs\") pod \"network-metrics-daemon-5qztt\" (UID: \"b83a0bd7-6f44-4045-bb8c-e80f10959714\") " pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:55 crc kubenswrapper[4740]: E0105 13:49:55.104298 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:55.604272198 +0000 UTC m=+44.911180777 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.148268 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b83a0bd7-6f44-4045-bb8c-e80f10959714-metrics-certs\") pod \"network-metrics-daemon-5qztt\" (UID: \"b83a0bd7-6f44-4045-bb8c-e80f10959714\") " pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.205489 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:55 crc kubenswrapper[4740]: E0105 13:49:55.206024 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:55.706011385 +0000 UTC m=+45.012919964 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.291263 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" podStartSLOduration=16.291247784 podStartE2EDuration="16.291247784s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:55.290402491 +0000 UTC m=+44.597311090" watchObservedRunningTime="2026-01-05 13:49:55.291247784 +0000 UTC m=+44.598156363" Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.293049 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5qztt" Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.309256 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:55 crc kubenswrapper[4740]: E0105 13:49:55.309815 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:55.809799776 +0000 UTC m=+45.116708355 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.413727 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:55 crc kubenswrapper[4740]: E0105 13:49:55.414048 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:55.91403726 +0000 UTC m=+45.220945839 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.423610 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k4d99" podStartSLOduration=16.423593778 podStartE2EDuration="16.423593778s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:55.380182258 +0000 UTC m=+44.687090837" watchObservedRunningTime="2026-01-05 13:49:55.423593778 +0000 UTC m=+44.730502357" Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.453928 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" podStartSLOduration=16.453911268 podStartE2EDuration="16.453911268s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:55.453513407 +0000 UTC m=+44.760421986" watchObservedRunningTime="2026-01-05 13:49:55.453911268 +0000 UTC m=+44.760819847" Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.465264 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" event={"ID":"80bfed68-6820-4458-aa8a-779cc3120e43","Type":"ContainerStarted","Data":"854fa3fc7d86c381b0bfa208811c3eea7363f098b14079329c33da98b6335991"} Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.466773 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-f8vfq" event={"ID":"61ccfc6e-26f3-473d-9a65-40ca39aafca2","Type":"ContainerStarted","Data":"14d1a0dd9b47a01c1c3209c3aef94dfbcb5c477c7bd10a123ffa701476b8da1d"} Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.466797 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-f8vfq" event={"ID":"61ccfc6e-26f3-473d-9a65-40ca39aafca2","Type":"ContainerStarted","Data":"e531b45c939adbd22e8284edbe97e6c0c28f3ec194b8676d3d9d51693f91b150"} Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.492809 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82" event={"ID":"89ec45a7-04a1-4913-b5e0-9ebc1d04f46c","Type":"ContainerStarted","Data":"2d028a727d0b36755c23cd2270a08de8feb22b8e2e48d2b91c33522f3923133b"} Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.492861 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82" event={"ID":"89ec45a7-04a1-4913-b5e0-9ebc1d04f46c","Type":"ContainerStarted","Data":"e07107710351b9b0bb55578ae39f1669a6f04d83774d64ce9507e738888dca72"} Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.509839 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-g56kn" 
event={"ID":"9768b475-a5b9-407b-b87b-b2a7b5a90260","Type":"ContainerStarted","Data":"5f29b5e3433533ada3c2fc9fd765c3296f5773f383d9b5a04d16bf6647e1f0bc"} Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.514927 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:55 crc kubenswrapper[4740]: E0105 13:49:55.515028 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:56.015013368 +0000 UTC m=+45.321921947 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.515251 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" event={"ID":"9a8e3cda-8a5c-4308-85d5-c8d2abe334f7","Type":"ContainerStarted","Data":"40af162ae9148cd076ba50a8c8335ecbdb89806d3f1ec84a365dc1782499c5f0"} Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.515457 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:55 crc kubenswrapper[4740]: E0105 13:49:55.515699 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:56.015692385 +0000 UTC m=+45.322600964 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.527882 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" event={"ID":"f27a472e-ea53-41f1-9360-84b2f9f0fd36","Type":"ContainerStarted","Data":"3d118459a056f77cbbeda320e116e86f6ac6f10af9be948fc9769134b7a7cf69"} Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.527932 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" event={"ID":"f27a472e-ea53-41f1-9360-84b2f9f0fd36","Type":"ContainerStarted","Data":"a01a0cd18108f035ca9932878c7714ae6abb027b3bf22778b8f42d84d311fcf5"} Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.533817 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" event={"ID":"8707fead-c468-4d30-8966-238dee410c47","Type":"ContainerStarted","Data":"83b77905bae7dd67639159379224b78ba6557eb412e5f33db9181942f644b594"} Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.547104 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qxndq" event={"ID":"8111c300-7d33-4696-a1f1-cb7e5ebfaa87","Type":"ContainerStarted","Data":"192085512ae03a579f0269a013bca38037bb6c1ddfa84140666f7728e67974c0"} Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.549407 4740 patch_prober.go:28] interesting pod/downloads-7954f5f757-tjvv9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.549449 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tjvv9" podUID="f6a82703-5aea-4d1a-a8fa-3e4393a1176b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.620417 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:55 crc kubenswrapper[4740]: E0105 13:49:55.624393 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:56.124370104 +0000 UTC m=+45.431278683 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.686993 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ngp5g"] Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.691879 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-59r7l"] Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.722182 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:55 crc kubenswrapper[4740]: E0105 13:49:55.723617 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:56.223606766 +0000 UTC m=+45.530515345 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.738184 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jd92l" podStartSLOduration=16.738168715 podStartE2EDuration="16.738168715s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:55.736597615 +0000 UTC m=+45.043506194" watchObservedRunningTime="2026-01-05 13:49:55.738168715 +0000 UTC m=+45.045077294" Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.775567 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" podStartSLOduration=16.775551659 podStartE2EDuration="16.775551659s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:55.774666795 +0000 UTC m=+45.081575374" watchObservedRunningTime="2026-01-05 13:49:55.775551659 +0000 UTC m=+45.082460238" Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.823612 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:55 crc kubenswrapper[4740]: E0105 13:49:55.823984 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:56.323957038 +0000 UTC m=+45.630865617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.824045 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:55 crc kubenswrapper[4740]: E0105 13:49:55.824412 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:56.32439989 +0000 UTC m=+45.631308469 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.896875 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-r7n7z" podStartSLOduration=16.896860386 podStartE2EDuration="16.896860386s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:55.895972433 +0000 UTC m=+45.202881012" watchObservedRunningTime="2026-01-05 13:49:55.896860386 +0000 UTC m=+45.203768965" Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.926623 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:55 crc kubenswrapper[4740]: E0105 13:49:55.926977 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:56.426962709 +0000 UTC m=+45.733871288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.950659 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn" podStartSLOduration=16.950643695 podStartE2EDuration="16.950643695s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:55.950159503 +0000 UTC m=+45.257068082" watchObservedRunningTime="2026-01-05 13:49:55.950643695 +0000 UTC m=+45.257552274" Jan 05 13:49:55 crc kubenswrapper[4740]: I0105 13:49:55.976818 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" podStartSLOduration=16.976805206 podStartE2EDuration="16.976805206s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:55.975488462 +0000 UTC m=+45.282397041" watchObservedRunningTime="2026-01-05 13:49:55.976805206 +0000 UTC m=+45.283713785" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.027781 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:56 crc kubenswrapper[4740]: E0105 13:49:56.028181 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:56.528168473 +0000 UTC m=+45.835077052 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.130494 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:56 crc kubenswrapper[4740]: E0105 13:49:56.130643 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:56.630617919 +0000 UTC m=+45.937526498 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.130890 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:56 crc kubenswrapper[4740]: E0105 13:49:56.131164 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:56.631158083 +0000 UTC m=+45.938066662 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.133574 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.143477 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 05 13:49:56 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Jan 05 13:49:56 crc kubenswrapper[4740]: [+]process-running ok Jan 05 13:49:56 crc kubenswrapper[4740]: healthz check failed Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.143526 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.236461 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.236971 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7v62c" podStartSLOduration=17.236957766 podStartE2EDuration="17.236957766s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:56.234462581 +0000 UTC m=+45.541371160" watchObservedRunningTime="2026-01-05 13:49:56.236957766 +0000 UTC m=+45.543866345" Jan 05 13:49:56 crc kubenswrapper[4740]: E0105 13:49:56.237114 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:56.73709689 +0000 UTC m=+46.044005469 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.265777 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h2txk" podStartSLOduration=17.265759957 podStartE2EDuration="17.265759957s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:56.26088752 +0000 UTC m=+45.567796099" watchObservedRunningTime="2026-01-05 13:49:56.265759957 +0000 UTC m=+45.572668536" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.312235 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" podStartSLOduration=17.312220025 podStartE2EDuration="17.312220025s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:56.306511807 +0000 UTC m=+45.613420386" watchObservedRunningTime="2026-01-05 13:49:56.312220025 +0000 UTC m=+45.619128604" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.343750 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:56 crc kubenswrapper[4740]: E0105 13:49:56.344023 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:56.844012262 +0000 UTC m=+46.150920841 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.368191 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt"] Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.447556 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:56 crc kubenswrapper[4740]: E0105 13:49:56.447650 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:56.947634469 +0000 UTC m=+46.254543048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.448023 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:56 crc kubenswrapper[4740]: E0105 13:49:56.448315 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:56.948308598 +0000 UTC m=+46.255217177 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.491007 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-g56kn" podStartSLOduration=5.490991388 podStartE2EDuration="5.490991388s" podCreationTimestamp="2026-01-05 13:49:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:56.459813127 +0000 UTC m=+45.766721696" watchObservedRunningTime="2026-01-05 13:49:56.490991388 +0000 UTC m=+45.797899967" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.493182 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qvxnd"] Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.511503 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" podStartSLOduration=17.511485931 podStartE2EDuration="17.511485931s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:56.510849485 +0000 UTC m=+45.817758064" watchObservedRunningTime="2026-01-05 13:49:56.511485931 +0000 UTC m=+45.818394510" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.528152 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46"] Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.530253 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw"] Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.538729 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmk6"] Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.538769 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc6s7"] Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.550801 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:56 crc kubenswrapper[4740]: E0105 13:49:56.551096 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:57.051080522 +0000 UTC m=+46.357989101 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.551582 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-f8vfq" podStartSLOduration=17.551566595 podStartE2EDuration="17.551566595s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:56.542865278 +0000 UTC m=+45.849773857" watchObservedRunningTime="2026-01-05 13:49:56.551566595 +0000 UTC m=+45.858475174" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.563578 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ngp5g" event={"ID":"35423004-ee05-40a4-9125-6b948b8371e8","Type":"ContainerStarted","Data":"782dfc8715ca5174b5ef78503bf9fc77ed5e15d92ea88aafa10b6f55b2b48c0e"} Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.563614 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ngp5g" event={"ID":"35423004-ee05-40a4-9125-6b948b8371e8","Type":"ContainerStarted","Data":"4bcb22374fdff6ab713e4adbddf98c5e5d7b5dc71dadb618160a00768a61f45b"} Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.572273 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5676h"] Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.573232 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qxndq" event={"ID":"8111c300-7d33-4696-a1f1-cb7e5ebfaa87","Type":"ContainerStarted","Data":"d79a1b3595abf2736d43ef3b5449eaf6d4983fee40c83d6812b077e1a8b847ff"} Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.573706 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-67brg"] Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.575365 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zhjv"] Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.583715 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ngp5g" podStartSLOduration=17.583701101 podStartE2EDuration="17.583701101s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:56.581829282 +0000 UTC m=+45.888737861" watchObservedRunningTime="2026-01-05 13:49:56.583701101 +0000 UTC m=+45.890609680" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.583796 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" 
event={"ID":"1759d33c-e10d-4db8-91aa-60a30ff68255","Type":"ContainerStarted","Data":"a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179"} Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.583822 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.592654 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ppzj9"] Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.593761 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv"] Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.596144 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qvxnd" event={"ID":"9656e5b3-4b38-4441-b2e2-13d3d2a31100","Type":"ContainerStarted","Data":"c637e41025d763d4cfcddfddc8dbcd59afeb550cb9f616ce015a23093e2ffed9"} Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.607800 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82" event={"ID":"89ec45a7-04a1-4913-b5e0-9ebc1d04f46c","Type":"ContainerStarted","Data":"1c595427d586a8a07ed559b0c5b739acdc04c9a4e38ed710585a5acc79d5f1e6"} Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.608596 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.613133 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qxndq" podStartSLOduration=5.613117507 podStartE2EDuration="5.613117507s" podCreationTimestamp="2026-01-05 13:49:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:56.610780736 +0000 UTC m=+45.917689315" watchObservedRunningTime="2026-01-05 13:49:56.613117507 +0000 UTC m=+45.920026086" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.614884 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.620726 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" event={"ID":"9a8e3cda-8a5c-4308-85d5-c8d2abe334f7","Type":"ContainerStarted","Data":"ecc7d2c1cfc77cd1402b09b095c4285b3ec4fcd17037397d8f36610599f578dd"} Jan 05 13:49:56 crc kubenswrapper[4740]: W0105 13:49:56.620729 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd06c65d1_809e_4e4f_922b_999763b9dd7a.slice/crio-d41722a3e30617ec32e2b435daa30b2d172c358a36bfb69d6645e1a5f9c749a3 WatchSource:0}: Error finding container d41722a3e30617ec32e2b435daa30b2d172c358a36bfb69d6645e1a5f9c749a3: Status 404 returned error can't find the container with id d41722a3e30617ec32e2b435daa30b2d172c358a36bfb69d6645e1a5f9c749a3 Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.620763 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" 
event={"ID":"9a8e3cda-8a5c-4308-85d5-c8d2abe334f7","Type":"ContainerStarted","Data":"1e62bdd358f65e3c840143c9a78a17c4a78910f00f3d241f25a9bbdf5b638237"} Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.644192 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.644753 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.653426 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:56 crc kubenswrapper[4740]: E0105 13:49:56.653828 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:57.153814546 +0000 UTC m=+46.460723125 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.667790 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" event={"ID":"f27a472e-ea53-41f1-9360-84b2f9f0fd36","Type":"ContainerStarted","Data":"d136afb30e8aae24e1e8d6c3d21d27ffacb44ec9ee7ede3dcbfde6a52117f773"} Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.670167 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82" podStartSLOduration=17.670147951 podStartE2EDuration="17.670147951s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:56.654558795 +0000 UTC m=+45.961467374" watchObservedRunningTime="2026-01-05 13:49:56.670147951 +0000 UTC m=+45.977056520" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.679635 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.680903 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-59r7l" event={"ID":"e5e922e0-65fb-449b-a90a-793b79189089","Type":"ContainerStarted","Data":"5592b8cc1fc0672b0d3c89e5ce95a189e62f37ef5f93c1b7e39e492562d9455b"} Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.680941 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-59r7l" 
event={"ID":"e5e922e0-65fb-449b-a90a-793b79189089","Type":"ContainerStarted","Data":"6937c7f5c4258ea53c0e564d5d4a40367a159aa4ef796cbdffe55858dd20ad86"} Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.695007 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" podStartSLOduration=6.6949885380000005 podStartE2EDuration="6.694988538s" podCreationTimestamp="2026-01-05 13:49:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:56.687265746 +0000 UTC m=+45.994174335" watchObservedRunningTime="2026-01-05 13:49:56.694988538 +0000 UTC m=+46.001897117" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.706672 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" event={"ID":"80bfed68-6820-4458-aa8a-779cc3120e43","Type":"ContainerStarted","Data":"4724174628abc536dad3149a35513f30b96fed7b55a35373eb785017010ecf8f"} Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.707463 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.708673 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.708840 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.730652 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkdl5"] Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.748999 4740 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-45gg9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.749040 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" podUID="80bfed68-6820-4458-aa8a-779cc3120e43" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.752638 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" event={"ID":"b43916b9-e413-4a80-880a-3feee7227ec5","Type":"ContainerStarted","Data":"d45e3be57f38ed431c57e4e3a1fcc48f6d4dfb6a02a90d63fed14cee7010787c"} Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.752665 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" event={"ID":"b43916b9-e413-4a80-880a-3feee7227ec5","Type":"ContainerStarted","Data":"89e17e8c4d0c11646f8b21ab0fd8b1796565f6dfb8657a57035bb197a938e5b7"} Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.752679 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.754119 4740 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:56 crc kubenswrapper[4740]: E0105 13:49:56.756621 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:57.256597201 +0000 UTC m=+46.563505780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.756619 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-59r7l" podStartSLOduration=17.756599381 podStartE2EDuration="17.756599381s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:56.74391183 +0000 UTC m=+46.050820399" watchObservedRunningTime="2026-01-05 13:49:56.756599381 +0000 UTC m=+46.063507960" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.761203 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tbzlt"] Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.761239 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pcbgg"] Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.761980 4740 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-tzhlt container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.762022 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" podUID="b43916b9-e413-4a80-880a-3feee7227ec5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.763421 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5qztt"] Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.764730 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.765470 4740 patch_prober.go:28] interesting pod/apiserver-76f77b778f-jvn7z container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[+]ping ok Jan 05 13:49:56 crc kubenswrapper[4740]: [+]log ok Jan 05 13:49:56 crc kubenswrapper[4740]: [+]etcd ok Jan 05 13:49:56 crc kubenswrapper[4740]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 05 13:49:56 crc kubenswrapper[4740]: [+]poststarthook/generic-apiserver-start-informers ok Jan 05 13:49:56 crc kubenswrapper[4740]: [+]poststarthook/max-in-flight-filter ok Jan 05 13:49:56 crc kubenswrapper[4740]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 05 13:49:56 crc kubenswrapper[4740]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 05 13:49:56 crc kubenswrapper[4740]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 05 13:49:56 crc kubenswrapper[4740]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 05 13:49:56 crc kubenswrapper[4740]: [+]poststarthook/project.openshift.io-projectcache ok Jan 05 13:49:56 crc kubenswrapper[4740]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 05 13:49:56 crc kubenswrapper[4740]: [+]poststarthook/openshift.io-startinformers ok Jan 05 13:49:56 crc kubenswrapper[4740]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 05 13:49:56 crc kubenswrapper[4740]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 05 13:49:56 crc kubenswrapper[4740]: livez check failed Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.765541 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" podUID="8707fead-c468-4d30-8966-238dee410c47" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.795026 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dztht"] Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.810000 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm"] Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.820351 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zpngz" podStartSLOduration=17.82033451 podStartE2EDuration="17.82033451s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:56.818241045 +0000 UTC m=+46.125149644" watchObservedRunningTime="2026-01-05 13:49:56.82033451 +0000 UTC m=+46.127243089" Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.855903 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:56 crc kubenswrapper[4740]: E0105 13:49:56.858542 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:57.358527973 +0000 UTC m=+46.665436552 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.863386 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-75w2r" podStartSLOduration=17.86337383 podStartE2EDuration="17.86337383s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:56.859047077 +0000 UTC m=+46.165955656" watchObservedRunningTime="2026-01-05 13:49:56.86337383 +0000 UTC m=+46.170282409" Jan 05 13:49:56 crc kubenswrapper[4740]: W0105 13:49:56.879408 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2c9c9bf_e6ce_4deb_b0d2_0669949b739e.slice/crio-5ac369404ce18467f1dfd18d78928db68370a5d910a0f0f0ec0fec251d2ae65e WatchSource:0}: Error finding container 5ac369404ce18467f1dfd18d78928db68370a5d910a0f0f0ec0fec251d2ae65e: Status 404 returned error can't find the container with id 5ac369404ce18467f1dfd18d78928db68370a5d910a0f0f0ec0fec251d2ae65e Jan 05 13:49:56 crc kubenswrapper[4740]: I0105 13:49:56.958323 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:56 crc kubenswrapper[4740]: E0105 13:49:56.958662 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:57.458646469 +0000 UTC m=+46.765555048 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.021193 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" podStartSLOduration=18.021175686 podStartE2EDuration="18.021175686s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:56.99709607 +0000 UTC m=+46.304004669" watchObservedRunningTime="2026-01-05 13:49:57.021175686 +0000 UTC m=+46.328084265" Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.059749 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:57 crc kubenswrapper[4740]: E0105 13:49:57.060056 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:57.560043758 +0000 UTC m=+46.866952337 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.138606 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 05 13:49:57 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Jan 05 13:49:57 crc kubenswrapper[4740]: [+]process-running ok Jan 05 13:49:57 crc kubenswrapper[4740]: healthz check failed Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.138863 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.161694 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:57 crc kubenswrapper[4740]: E0105 13:49:57.162042 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:57.662028022 +0000 UTC m=+46.968936601 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.263952 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:57 crc kubenswrapper[4740]: E0105 13:49:57.264267 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:57.764256963 +0000 UTC m=+47.071165532 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.315630 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" podStartSLOduration=18.315607089 podStartE2EDuration="18.315607089s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:57.022303636 +0000 UTC m=+46.329212215" watchObservedRunningTime="2026-01-05 13:49:57.315607089 +0000 UTC m=+46.622515668" Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.318031 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lg5bc"] Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.364649 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:57 crc kubenswrapper[4740]: E0105 13:49:57.365009 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:57.864993775 +0000 UTC m=+47.171902354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.466558 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:57 crc kubenswrapper[4740]: E0105 13:49:57.466924 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:57.966908587 +0000 UTC m=+47.273817166 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.564371 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jkzsn" Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.567555 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:57 crc kubenswrapper[4740]: E0105 13:49:57.567788 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:58.067758732 +0000 UTC m=+47.374667311 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.567999 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:57 crc kubenswrapper[4740]: E0105 13:49:57.568303 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:58.068291416 +0000 UTC m=+47.375199995 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.670656 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:57 crc kubenswrapper[4740]: E0105 13:49:57.670811 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:58.170788593 +0000 UTC m=+47.477697162 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.670969 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:57 crc kubenswrapper[4740]: E0105 13:49:57.671216 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:58.171204974 +0000 UTC m=+47.478113553 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.772642 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:57 crc kubenswrapper[4740]: E0105 13:49:57.772904 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:58.272884391 +0000 UTC m=+47.579792970 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.773097 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:57 crc kubenswrapper[4740]: E0105 13:49:57.773396 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:58.273387174 +0000 UTC m=+47.580295753 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.774345 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ppzj9" event={"ID":"c9654c8c-9b4e-4f40-8226-ada3fccc7b28","Type":"ContainerStarted","Data":"9c12b26a4d3d34036bc6dfbe65ff5dbbf37a344f8c2a321b86da0f4a30e1cacd"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.774381 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ppzj9" event={"ID":"c9654c8c-9b4e-4f40-8226-ada3fccc7b28","Type":"ContainerStarted","Data":"9a6dbdc8b6aae3d950d3c70db02031b7fbc82071af87d31fe14556c31c086a6f"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.784121 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmk6" event={"ID":"b9326591-f868-4939-9806-a5f09e56a0d0","Type":"ContainerStarted","Data":"5e50839ae13e809093a5960b38a522559c852ab0e6931870f22a23b847ee8cfb"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.784165 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmk6" event={"ID":"b9326591-f868-4939-9806-a5f09e56a0d0","Type":"ContainerStarted","Data":"0f53770a36b5e13d5277c661bd785a405e9ab1ee5e1449978ad4455354ac63a2"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.784174 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmk6" event={"ID":"b9326591-f868-4939-9806-a5f09e56a0d0","Type":"ContainerStarted","Data":"86ba87fec119538e44cecf0bfd2b65e9ceff4a25f6256fe3a2d5a758a4bf0d48"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.803128 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5676h" event={"ID":"b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd","Type":"ContainerStarted","Data":"29b8085edee87e2f86fbe6ad54ceb79c0f07b90b3e5c842a98eebed0204a834d"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.803165 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5676h" event={"ID":"b1a69ff2-2e9c-46b6-b3bb-9f5c08dfadcd","Type":"ContainerStarted","Data":"7ba9020d963db0b405d461ec7d21f572e4cea8a61770371a703cf579d5f3bb8d"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.819362 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5qztt" event={"ID":"b83a0bd7-6f44-4045-bb8c-e80f10959714","Type":"ContainerStarted","Data":"4240addda5e33327475dd768d8334fe9635f86e55aea9b201fd85342bf94ab73"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.819402 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5qztt" event={"ID":"b83a0bd7-6f44-4045-bb8c-e80f10959714","Type":"ContainerStarted","Data":"da89c2c140937ace0a719bafe9f746124b211c012d9aa8fe788ad749645ff9c9"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.829416 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" event={"ID":"8255876c-990a-4658-8d74-66b4d45e379c","Type":"ContainerStarted","Data":"d1835304bd676c0361188708b1e63341a228807332ec98acce6d399482f12998"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.829476 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" event={"ID":"8255876c-990a-4658-8d74-66b4d45e379c","Type":"ContainerStarted","Data":"32fd0d279916ae83bb79f3da07d323a2cdf55f9b5ea378d90c9027b13677fe1b"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.830427 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.831495 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lwmk6" podStartSLOduration=18.831480775 podStartE2EDuration="18.831480775s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:57.802908482 +0000 UTC m=+47.109817061" watchObservedRunningTime="2026-01-05 13:49:57.831480775 +0000 UTC m=+47.138389354" Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.833292 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv" event={"ID":"f6af3027-f0d8-4616-8622-b07d4b8f0b94","Type":"ContainerStarted","Data":"742bc231442b630ddeccb01c4a20a4cc2313f887c456158300ac3f388d5d45b6"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.833323 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv" event={"ID":"f6af3027-f0d8-4616-8622-b07d4b8f0b94","Type":"ContainerStarted","Data":"8e6b7e87993113f82fd9108582c3bdefee74e9c0f50b25ffcdc47f147f94f770"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.833333 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv" event={"ID":"f6af3027-f0d8-4616-8622-b07d4b8f0b94","Type":"ContainerStarted","Data":"49d62d6eef7bb761207eead4c21ca3c2646cdaa1eca1fcf151b3d4463d2dfee1"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.841734 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" event={"ID":"3661c83f-42d5-4441-95c9-bc94757cc85d","Type":"ContainerStarted","Data":"be47c88b4e916033fc5168f0f4348ba2c6ad43147e58db32d3ab224423c2a3c4"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.843351 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" event={"ID":"409599e0-5f32-4b72-9c6a-73c9d9d4cc63","Type":"ContainerStarted","Data":"6376a4ccdd7d44d6d9831a5b4572a41715dbcb7f2d9c609a98891f8118d7abee"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.843375 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" event={"ID":"409599e0-5f32-4b72-9c6a-73c9d9d4cc63","Type":"ContainerStarted","Data":"5e716e651f2ddd84d7825e9cbe10090854eead82a42bfced0a07cec444324464"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.850976 4740 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.858114 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5676h" podStartSLOduration=18.858096058 podStartE2EDuration="18.858096058s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:57.830280704 +0000 UTC m=+47.137189293" watchObservedRunningTime="2026-01-05 13:49:57.858096058 +0000 UTC m=+47.165004647" Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.873642 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qvxnd" event={"ID":"9656e5b3-4b38-4441-b2e2-13d3d2a31100","Type":"ContainerStarted","Data":"f9dab4495e742e64815fe6bb6ec1c4e37b79228a1400400d88b5a2c1c2db96cf"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.877754 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" podStartSLOduration=18.877738099 podStartE2EDuration="18.877738099s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:57.86045942 +0000 UTC m=+47.167368019" watchObservedRunningTime="2026-01-05 13:49:57.877738099 +0000 UTC m=+47.184646678" Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.883686 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:57 crc kubenswrapper[4740]: E0105 13:49:57.884011 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:58.383996162 +0000 UTC m=+47.690904741 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.884772 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:57 crc kubenswrapper[4740]: E0105 13:49:57.885139 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-05 13:49:58.385132542 +0000 UTC m=+47.692041121 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.886314 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dztht" event={"ID":"817177d4-0bf1-4fa4-babc-86b3bc19af23","Type":"ContainerStarted","Data":"95f20e91509484703da9f04fd833c40a202a611678ea3564b72be09d43f8adc2"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.886488 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dztht" event={"ID":"817177d4-0bf1-4fa4-babc-86b3bc19af23","Type":"ContainerStarted","Data":"2ae0d24782d5b266cddf13fd147b7b310aaa173372fa2c8673163fb6eff6ed48"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.898335 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.904921 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" event={"ID":"eb24541e-e924-4085-b91e-e2d5a0bc8349","Type":"ContainerStarted","Data":"2fc220aa671f652aa45c2774a1bee1e8574c42b21b00d51203829998146a314d"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.904968 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" event={"ID":"eb24541e-e924-4085-b91e-e2d5a0bc8349","Type":"ContainerStarted","Data":"84c26dffb09920ba420a2d88fc079ca373705e9f22705e2d4ff8a308936659a9"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.906350 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lt6sv" podStartSLOduration=18.906337464 podStartE2EDuration="18.906337464s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:57.904441275 +0000 UTC m=+47.211349854" watchObservedRunningTime="2026-01-05 13:49:57.906337464 +0000 UTC m=+47.213246043" Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.907487 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" podStartSLOduration=18.907482154 podStartE2EDuration="18.907482154s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:57.879572018 +0000 UTC m=+47.186480597" watchObservedRunningTime="2026-01-05 13:49:57.907482154 +0000 UTC m=+47.214390733" Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.907648 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.911271 4740 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xvzzm container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.911313 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" podUID="eb24541e-e924-4085-b91e-e2d5a0bc8349" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.913324 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.983318 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" event={"ID":"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e","Type":"ContainerStarted","Data":"a0d10433436a4e90508331fe80bfa30282af438f8aaf906e458df457162446eb"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.983361 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" event={"ID":"f2c9c9bf-e6ce-4deb-b0d2-0669949b739e","Type":"ContainerStarted","Data":"5ac369404ce18467f1dfd18d78928db68370a5d910a0f0f0ec0fec251d2ae65e"} Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.984996 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.98497717 podStartE2EDuration="984.97717ms" podCreationTimestamp="2026-01-05 13:49:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:57.983377879 +0000 UTC m=+47.290286458" watchObservedRunningTime="2026-01-05 13:49:57.98497717 +0000 UTC m=+47.291885749" Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.985452 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" podStartSLOduration=18.985448832 podStartE2EDuration="18.985448832s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:57.957212677 +0000 UTC m=+47.264121256" watchObservedRunningTime="2026-01-05 13:49:57.985448832 +0000 UTC m=+47.292357411" Jan 05 13:49:57 crc kubenswrapper[4740]: I0105 13:49:57.986280 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:57 crc kubenswrapper[4740]: E0105 13:49:57.987209 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-05 13:49:58.487184148 +0000 UTC m=+47.794092727 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.013709 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc6s7" event={"ID":"ee74f549-8b8c-4bc6-b5c2-b2be35ad0697","Type":"ContainerStarted","Data":"f676edfeb7c8d48e38bc1637b5904ca70f742c7363977a6bd4a55df914acda8b"} Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.013753 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc6s7" event={"ID":"ee74f549-8b8c-4bc6-b5c2-b2be35ad0697","Type":"ContainerStarted","Data":"39e4abba8910d50632ec1d84ec49ec4c92e3b86fdcaedff22b48509df5866703"} Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.027571 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dztht" podStartSLOduration=19.027556278 podStartE2EDuration="19.027556278s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:58.013508663 +0000 UTC m=+47.320417242" watchObservedRunningTime="2026-01-05 13:49:58.027556278 +0000 UTC m=+47.334464857" Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.048356 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zhjv" event={"ID":"d06c65d1-809e-4e4f-922b-999763b9dd7a","Type":"ContainerStarted","Data":"db32b3f1510d8808e0f344b365982893807dd63b5c4cef8194a6aa2e66d0dc15"} Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.048391 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zhjv" event={"ID":"d06c65d1-809e-4e4f-922b-999763b9dd7a","Type":"ContainerStarted","Data":"d41722a3e30617ec32e2b435daa30b2d172c358a36bfb69d6645e1a5f9c749a3"} Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.061628 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc6s7" podStartSLOduration=19.061613545 podStartE2EDuration="19.061613545s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:58.060915776 +0000 UTC m=+47.367824355" watchObservedRunningTime="2026-01-05 13:49:58.061613545 +0000 UTC m=+47.368522124" Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.079424 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkdl5" 
event={"ID":"d7d9be6f-7415-4c1a-a6b9-6f5222d5580f","Type":"ContainerStarted","Data":"2d81d67e1b648b7975472e9116008a876550e38046ed763ae7907ccf178e25f9"} Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.079463 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkdl5" event={"ID":"d7d9be6f-7415-4c1a-a6b9-6f5222d5580f","Type":"ContainerStarted","Data":"868e9a75212dcfb76e8b91de9e24f8330ffef0eb4d074c505726cc9e044874a0"} Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.079582 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-tbzlt" podStartSLOduration=19.079563842 podStartE2EDuration="19.079563842s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:58.037364774 +0000 UTC m=+47.344273353" watchObservedRunningTime="2026-01-05 13:49:58.079563842 +0000 UTC m=+47.386472421" Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.091923 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:58 crc kubenswrapper[4740]: E0105 13:49:58.093199 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:58.593186227 +0000 UTC m=+47.900094806 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.097664 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zhjv" podStartSLOduration=19.097644422 podStartE2EDuration="19.097644422s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:58.096369529 +0000 UTC m=+47.403278108" watchObservedRunningTime="2026-01-05 13:49:58.097644422 +0000 UTC m=+47.404553001" Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.097976 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-67brg" event={"ID":"5b944b6c-4c73-4527-8d67-b9a2a4004606","Type":"ContainerStarted","Data":"74e1ac98f3d65e97725ba206050810ae32aa33199fc0b0edcbef70995f8deeda"} Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.098423 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-67brg" event={"ID":"5b944b6c-4c73-4527-8d67-b9a2a4004606","Type":"ContainerStarted","Data":"660396d65059dbe8e17b6041828f78f7603e304809316e7e28a8a8deec042322"} Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.112860 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.115258 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.134394 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kkdl5" podStartSLOduration=19.134378319 podStartE2EDuration="19.134378319s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:58.129722258 +0000 UTC m=+47.436630837" watchObservedRunningTime="2026-01-05 13:49:58.134378319 +0000 UTC m=+47.441286898" Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.150739 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 05 13:49:58 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Jan 05 13:49:58 crc kubenswrapper[4740]: [+]process-running ok Jan 05 13:49:58 crc kubenswrapper[4740]: healthz check failed Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.150800 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.192795 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:58 crc kubenswrapper[4740]: E0105 13:49:58.194860 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:58.694839332 +0000 UTC m=+48.001747911 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.299738 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:58 crc kubenswrapper[4740]: E0105 13:49:58.300020 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:58.80000781 +0000 UTC m=+48.106916399 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.400492 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:58 crc kubenswrapper[4740]: E0105 13:49:58.400796 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:58.900782372 +0000 UTC m=+48.207690951 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.502054 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:58 crc kubenswrapper[4740]: E0105 13:49:58.502396 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:59.002380026 +0000 UTC m=+48.309288595 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.602575 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:58 crc kubenswrapper[4740]: E0105 13:49:58.602707 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:59.102672707 +0000 UTC m=+48.409581276 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.602873 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:58 crc kubenswrapper[4740]: E0105 13:49:58.603122 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:59.103114618 +0000 UTC m=+48.410023197 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.704125 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:58 crc kubenswrapper[4740]: E0105 13:49:58.704442 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:59.204397604 +0000 UTC m=+48.511306183 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.805783 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:58 crc kubenswrapper[4740]: E0105 13:49:58.806194 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:59.306178932 +0000 UTC m=+48.613087521 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.913979 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:58 crc kubenswrapper[4740]: E0105 13:49:58.914226 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:59.414195484 +0000 UTC m=+48.721104063 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.914399 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:58 crc kubenswrapper[4740]: E0105 13:49:58.914745 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-05 13:49:59.414737618 +0000 UTC m=+48.721646197 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dqws6" (UID: "f488db4f-55ab-4654-b225-38742d36877c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:58 crc kubenswrapper[4740]: I0105 13:49:58.969052 4740 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.015211 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:59 crc kubenswrapper[4740]: E0105 13:49:59.015620 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-05 13:49:59.515602483 +0000 UTC m=+48.822511062 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.080424 4740 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-05T13:49:58.969101003Z","Handler":null,"Name":""} Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.085689 4740 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.085716 4740 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.114708 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" event={"ID":"3661c83f-42d5-4441-95c9-bc94757cc85d","Type":"ContainerStarted","Data":"2e3d99026171595082913125b9fb27f8f794b8f6c3c9441e89dceaf2fd90641f"} Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.114765 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" event={"ID":"3661c83f-42d5-4441-95c9-bc94757cc85d","Type":"ContainerStarted","Data":"586616db47f6f24b8859e15b73e13dc92206c1f720031333f08ea53db5ddce6d"} Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.116476 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.119300 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qvxnd" event={"ID":"9656e5b3-4b38-4441-b2e2-13d3d2a31100","Type":"ContainerStarted","Data":"4644006afc74f856988cb9401a9f70a66e6646d038d8081d00f1139d91c803b9"} Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.121870 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5qztt" event={"ID":"b83a0bd7-6f44-4045-bb8c-e80f10959714","Type":"ContainerStarted","Data":"b225fbbac721d41e803f73c485aa0b11c8d694b2edf0c167e0dd19a89332b5da"} Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.123640 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-67brg" event={"ID":"5b944b6c-4c73-4527-8d67-b9a2a4004606","Type":"ContainerStarted","Data":"835c5819500e9aadaf0f228bc84d3a717a027d887768e22ebbc519f120ff055d"} Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.128347 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ppzj9" 
event={"ID":"c9654c8c-9b4e-4f40-8226-ada3fccc7b28","Type":"ContainerStarted","Data":"1096e9c1d0ff44849e126134653c6863c48f3c16ac49c1d5ce3efa593c120a9d"} Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.129744 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-ppzj9" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.131073 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" podUID="1759d33c-e10d-4db8-91aa-60a30ff68255" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179" gracePeriod=30 Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.134946 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.138283 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 05 13:49:59 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Jan 05 13:49:59 crc kubenswrapper[4740]: [+]process-running ok Jan 05 13:49:59 crc kubenswrapper[4740]: healthz check failed Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.138356 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.145079 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-qvxnd" podStartSLOduration=20.145046592 podStartE2EDuration="20.145046592s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:59.142361832 +0000 UTC m=+48.449270411" watchObservedRunningTime="2026-01-05 13:49:59.145046592 +0000 UTC m=+48.451955171" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.157188 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.157253 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.192103 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-67brg" podStartSLOduration=20.192082486 podStartE2EDuration="20.192082486s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:59.190699821 +0000 UTC m=+48.497608410" watchObservedRunningTime="2026-01-05 13:49:59.192082486 +0000 UTC m=+48.498991075" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.211666 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ppzj9" podStartSLOduration=8.211648485 podStartE2EDuration="8.211648485s" podCreationTimestamp="2026-01-05 13:49:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:59.207705173 +0000 UTC m=+48.514613742" watchObservedRunningTime="2026-01-05 13:49:59.211648485 +0000 UTC m=+48.518557064" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.248838 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5qztt" podStartSLOduration=20.248818923 podStartE2EDuration="20.248818923s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:49:59.246925584 +0000 UTC m=+48.553834163" watchObservedRunningTime="2026-01-05 13:49:59.248818923 +0000 UTC m=+48.555727502" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.396168 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dqws6\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.424047 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.445840 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.474131 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.692713 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dqws6"] Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.771024 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k8t6t"] Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.772179 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k8t6t" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.773759 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.784565 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k8t6t"] Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.932478 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-catalog-content\") pod \"certified-operators-k8t6t\" (UID: \"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2\") " pod="openshift-marketplace/certified-operators-k8t6t" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.932535 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-utilities\") pod \"certified-operators-k8t6t\" (UID: \"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2\") " pod="openshift-marketplace/certified-operators-k8t6t" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.932570 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8c5z\" (UniqueName: \"kubernetes.io/projected/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-kube-api-access-x8c5z\") pod \"certified-operators-k8t6t\" (UID: \"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2\") " pod="openshift-marketplace/certified-operators-k8t6t" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.971861 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pkbs6"] Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.973088 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pkbs6" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.975823 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 05 13:49:59 crc kubenswrapper[4740]: I0105 13:49:59.981232 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pkbs6"] Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.034221 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-catalog-content\") pod \"certified-operators-k8t6t\" (UID: \"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2\") " pod="openshift-marketplace/certified-operators-k8t6t" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.034359 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-utilities\") pod \"certified-operators-k8t6t\" (UID: \"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2\") " pod="openshift-marketplace/certified-operators-k8t6t" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.034463 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8c5z\" (UniqueName: \"kubernetes.io/projected/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-kube-api-access-x8c5z\") pod \"certified-operators-k8t6t\" (UID: \"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2\") " pod="openshift-marketplace/certified-operators-k8t6t" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.035678 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-catalog-content\") pod \"certified-operators-k8t6t\" (UID: \"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2\") " pod="openshift-marketplace/certified-operators-k8t6t" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.036127 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-utilities\") pod \"certified-operators-k8t6t\" (UID: \"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2\") " pod="openshift-marketplace/certified-operators-k8t6t" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.053535 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8c5z\" (UniqueName: \"kubernetes.io/projected/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-kube-api-access-x8c5z\") pod \"certified-operators-k8t6t\" (UID: \"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2\") " pod="openshift-marketplace/certified-operators-k8t6t" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.100521 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k8t6t" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.135217 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kfdw\" (UniqueName: \"kubernetes.io/projected/8655819a-f2c2-4044-85e1-84f7e64cff21-kube-api-access-6kfdw\") pod \"community-operators-pkbs6\" (UID: \"8655819a-f2c2-4044-85e1-84f7e64cff21\") " pod="openshift-marketplace/community-operators-pkbs6" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.135258 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8655819a-f2c2-4044-85e1-84f7e64cff21-catalog-content\") pod \"community-operators-pkbs6\" (UID: \"8655819a-f2c2-4044-85e1-84f7e64cff21\") " pod="openshift-marketplace/community-operators-pkbs6" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.135275 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8655819a-f2c2-4044-85e1-84f7e64cff21-utilities\") pod \"community-operators-pkbs6\" (UID: \"8655819a-f2c2-4044-85e1-84f7e64cff21\") " pod="openshift-marketplace/community-operators-pkbs6" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.137211 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" event={"ID":"3661c83f-42d5-4441-95c9-bc94757cc85d","Type":"ContainerStarted","Data":"52fdd9df6ed8913dbd29222815d40541e87bf1c040220110f191686222e78345"} Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.137255 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" event={"ID":"3661c83f-42d5-4441-95c9-bc94757cc85d","Type":"ContainerStarted","Data":"56a34e9a2394fdfed89422b6d6891f9d83ccaab52075ca6b00934aa40c91837a"} Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.139301 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 05 13:50:00 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Jan 05 13:50:00 crc kubenswrapper[4740]: [+]process-running ok Jan 05 13:50:00 crc kubenswrapper[4740]: healthz check failed Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.139358 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.141619 4740 generic.go:334] "Generic (PLEG): container finished" podID="409599e0-5f32-4b72-9c6a-73c9d9d4cc63" containerID="6376a4ccdd7d44d6d9831a5b4572a41715dbcb7f2d9c609a98891f8118d7abee" exitCode=0 Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.141688 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" event={"ID":"409599e0-5f32-4b72-9c6a-73c9d9d4cc63","Type":"ContainerDied","Data":"6376a4ccdd7d44d6d9831a5b4572a41715dbcb7f2d9c609a98891f8118d7abee"} Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.144532 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" event={"ID":"f488db4f-55ab-4654-b225-38742d36877c","Type":"ContainerStarted","Data":"4ff63df9dc8111a76a79899c7d8931edc8bc2574a9da763bdad4b3b0691ea8d1"} Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.144611 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" event={"ID":"f488db4f-55ab-4654-b225-38742d36877c","Type":"ContainerStarted","Data":"42104e33f0a574f91773b7df63da715a1bc79ab1e6c279c64adbcc175fbd83ca"} Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.162551 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" podStartSLOduration=10.162535104 podStartE2EDuration="10.162535104s" podCreationTimestamp="2026-01-05 13:49:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:50:00.162360299 +0000 UTC m=+49.469268938" watchObservedRunningTime="2026-01-05 13:50:00.162535104 +0000 UTC m=+49.469443683" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.179668 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zvjqq"] Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.181184 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zvjqq" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.183431 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zvjqq"] Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.191360 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" podStartSLOduration=21.191343123 podStartE2EDuration="21.191343123s" podCreationTimestamp="2026-01-05 13:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:50:00.187716119 +0000 UTC m=+49.494624698" watchObservedRunningTime="2026-01-05 13:50:00.191343123 +0000 UTC m=+49.498251702" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.237692 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kfdw\" (UniqueName: \"kubernetes.io/projected/8655819a-f2c2-4044-85e1-84f7e64cff21-kube-api-access-6kfdw\") pod \"community-operators-pkbs6\" (UID: \"8655819a-f2c2-4044-85e1-84f7e64cff21\") " pod="openshift-marketplace/community-operators-pkbs6" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.238918 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8655819a-f2c2-4044-85e1-84f7e64cff21-catalog-content\") pod \"community-operators-pkbs6\" (UID: \"8655819a-f2c2-4044-85e1-84f7e64cff21\") " pod="openshift-marketplace/community-operators-pkbs6" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.238982 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8655819a-f2c2-4044-85e1-84f7e64cff21-utilities\") pod \"community-operators-pkbs6\" (UID: \"8655819a-f2c2-4044-85e1-84f7e64cff21\") " pod="openshift-marketplace/community-operators-pkbs6" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.239277 4740 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8655819a-f2c2-4044-85e1-84f7e64cff21-catalog-content\") pod \"community-operators-pkbs6\" (UID: \"8655819a-f2c2-4044-85e1-84f7e64cff21\") " pod="openshift-marketplace/community-operators-pkbs6" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.239365 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8655819a-f2c2-4044-85e1-84f7e64cff21-utilities\") pod \"community-operators-pkbs6\" (UID: \"8655819a-f2c2-4044-85e1-84f7e64cff21\") " pod="openshift-marketplace/community-operators-pkbs6" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.259979 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kfdw\" (UniqueName: \"kubernetes.io/projected/8655819a-f2c2-4044-85e1-84f7e64cff21-kube-api-access-6kfdw\") pod \"community-operators-pkbs6\" (UID: \"8655819a-f2c2-4044-85e1-84f7e64cff21\") " pod="openshift-marketplace/community-operators-pkbs6" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.293676 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pkbs6" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.333007 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k8t6t"] Jan 05 13:50:00 crc kubenswrapper[4740]: W0105 13:50:00.335336 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2dbc1ff_ca19_4faf_abb5_3b4fe99918f2.slice/crio-b0249a45d4de176b7d276028e1fb74762afbe6b44a0d121ee2138ccdcad052bd WatchSource:0}: Error finding container b0249a45d4de176b7d276028e1fb74762afbe6b44a0d121ee2138ccdcad052bd: Status 404 returned error can't find the container with id b0249a45d4de176b7d276028e1fb74762afbe6b44a0d121ee2138ccdcad052bd Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.339781 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888ada4e-9063-44ac-b293-8de842edb998-utilities\") pod \"certified-operators-zvjqq\" (UID: \"888ada4e-9063-44ac-b293-8de842edb998\") " pod="openshift-marketplace/certified-operators-zvjqq" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.339826 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888ada4e-9063-44ac-b293-8de842edb998-catalog-content\") pod \"certified-operators-zvjqq\" (UID: \"888ada4e-9063-44ac-b293-8de842edb998\") " pod="openshift-marketplace/certified-operators-zvjqq" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.339865 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcdnr\" (UniqueName: \"kubernetes.io/projected/888ada4e-9063-44ac-b293-8de842edb998-kube-api-access-pcdnr\") pod \"certified-operators-zvjqq\" (UID: \"888ada4e-9063-44ac-b293-8de842edb998\") " pod="openshift-marketplace/certified-operators-zvjqq" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.368296 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hfzkk"] Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.369182 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hfzkk" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.377177 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hfzkk"] Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.441144 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888ada4e-9063-44ac-b293-8de842edb998-utilities\") pod \"certified-operators-zvjqq\" (UID: \"888ada4e-9063-44ac-b293-8de842edb998\") " pod="openshift-marketplace/certified-operators-zvjqq" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.441177 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888ada4e-9063-44ac-b293-8de842edb998-catalog-content\") pod \"certified-operators-zvjqq\" (UID: \"888ada4e-9063-44ac-b293-8de842edb998\") " pod="openshift-marketplace/certified-operators-zvjqq" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.441220 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcdnr\" (UniqueName: \"kubernetes.io/projected/888ada4e-9063-44ac-b293-8de842edb998-kube-api-access-pcdnr\") pod \"certified-operators-zvjqq\" (UID: \"888ada4e-9063-44ac-b293-8de842edb998\") " pod="openshift-marketplace/certified-operators-zvjqq" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.443096 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888ada4e-9063-44ac-b293-8de842edb998-utilities\") pod \"certified-operators-zvjqq\" (UID: \"888ada4e-9063-44ac-b293-8de842edb998\") " pod="openshift-marketplace/certified-operators-zvjqq" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.443138 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888ada4e-9063-44ac-b293-8de842edb998-catalog-content\") pod \"certified-operators-zvjqq\" (UID: \"888ada4e-9063-44ac-b293-8de842edb998\") " pod="openshift-marketplace/certified-operators-zvjqq" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.467276 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcdnr\" (UniqueName: \"kubernetes.io/projected/888ada4e-9063-44ac-b293-8de842edb998-kube-api-access-pcdnr\") pod \"certified-operators-zvjqq\" (UID: \"888ada4e-9063-44ac-b293-8de842edb998\") " pod="openshift-marketplace/certified-operators-zvjqq" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.495779 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pkbs6"] Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.507184 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zvjqq" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.542658 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l8db\" (UniqueName: \"kubernetes.io/projected/f781df9d-cacf-433b-a588-35f67af41b66-kube-api-access-2l8db\") pod \"community-operators-hfzkk\" (UID: \"f781df9d-cacf-433b-a588-35f67af41b66\") " pod="openshift-marketplace/community-operators-hfzkk" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.542709 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f781df9d-cacf-433b-a588-35f67af41b66-catalog-content\") pod \"community-operators-hfzkk\" (UID: \"f781df9d-cacf-433b-a588-35f67af41b66\") " pod="openshift-marketplace/community-operators-hfzkk" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.542738 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f781df9d-cacf-433b-a588-35f67af41b66-utilities\") pod \"community-operators-hfzkk\" (UID: \"f781df9d-cacf-433b-a588-35f67af41b66\") " pod="openshift-marketplace/community-operators-hfzkk" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.644466 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l8db\" (UniqueName: \"kubernetes.io/projected/f781df9d-cacf-433b-a588-35f67af41b66-kube-api-access-2l8db\") pod \"community-operators-hfzkk\" (UID: \"f781df9d-cacf-433b-a588-35f67af41b66\") " pod="openshift-marketplace/community-operators-hfzkk" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.644569 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f781df9d-cacf-433b-a588-35f67af41b66-catalog-content\") pod \"community-operators-hfzkk\" (UID: \"f781df9d-cacf-433b-a588-35f67af41b66\") " pod="openshift-marketplace/community-operators-hfzkk" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.644643 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f781df9d-cacf-433b-a588-35f67af41b66-utilities\") pod \"community-operators-hfzkk\" (UID: \"f781df9d-cacf-433b-a588-35f67af41b66\") " pod="openshift-marketplace/community-operators-hfzkk" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.645133 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f781df9d-cacf-433b-a588-35f67af41b66-catalog-content\") pod \"community-operators-hfzkk\" (UID: \"f781df9d-cacf-433b-a588-35f67af41b66\") " pod="openshift-marketplace/community-operators-hfzkk" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.645180 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f781df9d-cacf-433b-a588-35f67af41b66-utilities\") pod \"community-operators-hfzkk\" (UID: \"f781df9d-cacf-433b-a588-35f67af41b66\") " pod="openshift-marketplace/community-operators-hfzkk" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.661938 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l8db\" (UniqueName: \"kubernetes.io/projected/f781df9d-cacf-433b-a588-35f67af41b66-kube-api-access-2l8db\") pod 
\"community-operators-hfzkk\" (UID: \"f781df9d-cacf-433b-a588-35f67af41b66\") " pod="openshift-marketplace/community-operators-hfzkk" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.673126 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zvjqq"] Jan 05 13:50:00 crc kubenswrapper[4740]: W0105 13:50:00.682972 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod888ada4e_9063_44ac_b293_8de842edb998.slice/crio-efc8661ab01d118652dbf546c4dcb0ff87250da3c1fb90ff4438c03d4ca6e65d WatchSource:0}: Error finding container efc8661ab01d118652dbf546c4dcb0ff87250da3c1fb90ff4438c03d4ca6e65d: Status 404 returned error can't find the container with id efc8661ab01d118652dbf546c4dcb0ff87250da3c1fb90ff4438c03d4ca6e65d Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.765214 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hfzkk" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.975896 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 05 13:50:00 crc kubenswrapper[4740]: I0105 13:50:00.985427 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hfzkk"] Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.137782 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 05 13:50:01 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Jan 05 13:50:01 crc kubenswrapper[4740]: [+]process-running ok Jan 05 13:50:01 crc kubenswrapper[4740]: healthz check failed Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.138476 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.151620 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.152310 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.155200 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.155330 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.159701 4740 generic.go:334] "Generic (PLEG): container finished" podID="8655819a-f2c2-4044-85e1-84f7e64cff21" containerID="036bbad9431ac4e1211dcc21964f324b485183cae176769c96320fbd14331c2d" exitCode=0 Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.159811 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkbs6" event={"ID":"8655819a-f2c2-4044-85e1-84f7e64cff21","Type":"ContainerDied","Data":"036bbad9431ac4e1211dcc21964f324b485183cae176769c96320fbd14331c2d"} Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.160004 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkbs6" event={"ID":"8655819a-f2c2-4044-85e1-84f7e64cff21","Type":"ContainerStarted","Data":"ed64ace77ae6f75d8e5fee98407219ba72324b09afb7027b8fb4824cb51f4e7d"} Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.162634 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.164504 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.164549 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfzkk" event={"ID":"f781df9d-cacf-433b-a588-35f67af41b66","Type":"ContainerStarted","Data":"5c5079391f83cf46ba0ddce6c18ddce4a9d5076d15b7e1cbe564018cafec9e62"} Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.164567 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfzkk" event={"ID":"f781df9d-cacf-433b-a588-35f67af41b66","Type":"ContainerStarted","Data":"8701e5af45eb2ab53fc58cce5fbb314c972dff346f2be237dfa938856f3949aa"} Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.170836 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvjqq" event={"ID":"888ada4e-9063-44ac-b293-8de842edb998","Type":"ContainerDied","Data":"b631471044aee78f3aff65f38298ad87bfc01cb238f28d24a2e6aad93e63e294"} Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.171481 4740 generic.go:334] "Generic (PLEG): container finished" podID="888ada4e-9063-44ac-b293-8de842edb998" containerID="b631471044aee78f3aff65f38298ad87bfc01cb238f28d24a2e6aad93e63e294" exitCode=0 Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.171588 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvjqq" event={"ID":"888ada4e-9063-44ac-b293-8de842edb998","Type":"ContainerStarted","Data":"efc8661ab01d118652dbf546c4dcb0ff87250da3c1fb90ff4438c03d4ca6e65d"} Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.190824 4740 generic.go:334] "Generic (PLEG): container finished" podID="d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2" containerID="5009012fd01d883db1c9431ad61841bb79087491f8055c89eea17d11cc2b6f18" exitCode=0 Jan 05 13:50:01 crc 
kubenswrapper[4740]: I0105 13:50:01.190877 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8t6t" event={"ID":"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2","Type":"ContainerDied","Data":"5009012fd01d883db1c9431ad61841bb79087491f8055c89eea17d11cc2b6f18"} Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.190929 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8t6t" event={"ID":"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2","Type":"ContainerStarted","Data":"b0249a45d4de176b7d276028e1fb74762afbe6b44a0d121ee2138ccdcad052bd"} Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.191056 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.252255 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e96f2f8-924c-4627-8e20-1e6153e47ce6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8e96f2f8-924c-4627-8e20-1e6153e47ce6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.252307 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e96f2f8-924c-4627-8e20-1e6153e47ce6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8e96f2f8-924c-4627-8e20-1e6153e47ce6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.340549 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.340795 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.346153 4740 patch_prober.go:28] interesting pod/console-f9d7485db-qj9kj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.346380 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-qj9kj" podUID="039f49cf-6394-4a22-ba5b-e5b681a51ca6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.354815 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e96f2f8-924c-4627-8e20-1e6153e47ce6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8e96f2f8-924c-4627-8e20-1e6153e47ce6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.354864 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e96f2f8-924c-4627-8e20-1e6153e47ce6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8e96f2f8-924c-4627-8e20-1e6153e47ce6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 
13:50:01.354942 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e96f2f8-924c-4627-8e20-1e6153e47ce6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8e96f2f8-924c-4627-8e20-1e6153e47ce6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.408669 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e96f2f8-924c-4627-8e20-1e6153e47ce6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8e96f2f8-924c-4627-8e20-1e6153e47ce6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.546596 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.660957 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-tjvv9" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.665801 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.714961 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.725353 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-jvn7z" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.761423 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-config-volume\") pod \"409599e0-5f32-4b72-9c6a-73c9d9d4cc63\" (UID: \"409599e0-5f32-4b72-9c6a-73c9d9d4cc63\") " Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.761676 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj5cn\" (UniqueName: \"kubernetes.io/projected/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-kube-api-access-cj5cn\") pod \"409599e0-5f32-4b72-9c6a-73c9d9d4cc63\" (UID: \"409599e0-5f32-4b72-9c6a-73c9d9d4cc63\") " Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.761797 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-secret-volume\") pod \"409599e0-5f32-4b72-9c6a-73c9d9d4cc63\" (UID: \"409599e0-5f32-4b72-9c6a-73c9d9d4cc63\") " Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.763449 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-config-volume" (OuterVolumeSpecName: "config-volume") pod "409599e0-5f32-4b72-9c6a-73c9d9d4cc63" (UID: "409599e0-5f32-4b72-9c6a-73c9d9d4cc63"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.775256 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-kube-api-access-cj5cn" (OuterVolumeSpecName: "kube-api-access-cj5cn") pod "409599e0-5f32-4b72-9c6a-73c9d9d4cc63" (UID: "409599e0-5f32-4b72-9c6a-73c9d9d4cc63"). InnerVolumeSpecName "kube-api-access-cj5cn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.777255 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "409599e0-5f32-4b72-9c6a-73c9d9d4cc63" (UID: "409599e0-5f32-4b72-9c6a-73c9d9d4cc63"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.788293 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-br6rk"] Jan 05 13:50:01 crc kubenswrapper[4740]: E0105 13:50:01.788711 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="409599e0-5f32-4b72-9c6a-73c9d9d4cc63" containerName="collect-profiles" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.788785 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="409599e0-5f32-4b72-9c6a-73c9d9d4cc63" containerName="collect-profiles" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.788975 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="409599e0-5f32-4b72-9c6a-73c9d9d4cc63" containerName="collect-profiles" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.789734 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-br6rk" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.792036 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.813327 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-br6rk"] Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.863403 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c70970cb-3f87-4296-96c0-c827e79eaee6-utilities\") pod \"redhat-marketplace-br6rk\" (UID: \"c70970cb-3f87-4296-96c0-c827e79eaee6\") " pod="openshift-marketplace/redhat-marketplace-br6rk" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.863842 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c70970cb-3f87-4296-96c0-c827e79eaee6-catalog-content\") pod \"redhat-marketplace-br6rk\" (UID: \"c70970cb-3f87-4296-96c0-c827e79eaee6\") " pod="openshift-marketplace/redhat-marketplace-br6rk" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.863921 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fplgh\" (UniqueName: \"kubernetes.io/projected/c70970cb-3f87-4296-96c0-c827e79eaee6-kube-api-access-fplgh\") pod \"redhat-marketplace-br6rk\" (UID: \"c70970cb-3f87-4296-96c0-c827e79eaee6\") " pod="openshift-marketplace/redhat-marketplace-br6rk" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.864093 4740 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-config-volume\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.864175 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj5cn\" (UniqueName: \"kubernetes.io/projected/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-kube-api-access-cj5cn\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.864232 4740 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/409599e0-5f32-4b72-9c6a-73c9d9d4cc63-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.939735 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 05 13:50:01 crc kubenswrapper[4740]: W0105 13:50:01.954768 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8e96f2f8_924c_4627_8e20_1e6153e47ce6.slice/crio-fa931b5cc91a5ab66a5f3e9d48be198e684dcaebc5d497fd1c6236ea3015aebb WatchSource:0}: Error finding container fa931b5cc91a5ab66a5f3e9d48be198e684dcaebc5d497fd1c6236ea3015aebb: Status 404 returned error can't find the container with id fa931b5cc91a5ab66a5f3e9d48be198e684dcaebc5d497fd1c6236ea3015aebb Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.965355 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c70970cb-3f87-4296-96c0-c827e79eaee6-utilities\") pod \"redhat-marketplace-br6rk\" (UID: \"c70970cb-3f87-4296-96c0-c827e79eaee6\") " 
pod="openshift-marketplace/redhat-marketplace-br6rk" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.965448 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c70970cb-3f87-4296-96c0-c827e79eaee6-catalog-content\") pod \"redhat-marketplace-br6rk\" (UID: \"c70970cb-3f87-4296-96c0-c827e79eaee6\") " pod="openshift-marketplace/redhat-marketplace-br6rk" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.965469 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fplgh\" (UniqueName: \"kubernetes.io/projected/c70970cb-3f87-4296-96c0-c827e79eaee6-kube-api-access-fplgh\") pod \"redhat-marketplace-br6rk\" (UID: \"c70970cb-3f87-4296-96c0-c827e79eaee6\") " pod="openshift-marketplace/redhat-marketplace-br6rk" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.966030 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c70970cb-3f87-4296-96c0-c827e79eaee6-utilities\") pod \"redhat-marketplace-br6rk\" (UID: \"c70970cb-3f87-4296-96c0-c827e79eaee6\") " pod="openshift-marketplace/redhat-marketplace-br6rk" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.966317 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c70970cb-3f87-4296-96c0-c827e79eaee6-catalog-content\") pod \"redhat-marketplace-br6rk\" (UID: \"c70970cb-3f87-4296-96c0-c827e79eaee6\") " pod="openshift-marketplace/redhat-marketplace-br6rk" Jan 05 13:50:01 crc kubenswrapper[4740]: I0105 13:50:01.983724 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fplgh\" (UniqueName: \"kubernetes.io/projected/c70970cb-3f87-4296-96c0-c827e79eaee6-kube-api-access-fplgh\") pod \"redhat-marketplace-br6rk\" (UID: \"c70970cb-3f87-4296-96c0-c827e79eaee6\") " pod="openshift-marketplace/redhat-marketplace-br6rk" Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.103680 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-br6rk" Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.137414 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 05 13:50:02 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Jan 05 13:50:02 crc kubenswrapper[4740]: [+]process-running ok Jan 05 13:50:02 crc kubenswrapper[4740]: healthz check failed Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.137476 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.174726 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-smpqb"] Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.175625 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-smpqb" Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.199092 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-smpqb"] Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.209721 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8e96f2f8-924c-4627-8e20-1e6153e47ce6","Type":"ContainerStarted","Data":"fa931b5cc91a5ab66a5f3e9d48be198e684dcaebc5d497fd1c6236ea3015aebb"} Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.235690 4740 generic.go:334] "Generic (PLEG): container finished" podID="f781df9d-cacf-433b-a588-35f67af41b66" containerID="5c5079391f83cf46ba0ddce6c18ddce4a9d5076d15b7e1cbe564018cafec9e62" exitCode=0 Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.235759 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfzkk" event={"ID":"f781df9d-cacf-433b-a588-35f67af41b66","Type":"ContainerDied","Data":"5c5079391f83cf46ba0ddce6c18ddce4a9d5076d15b7e1cbe564018cafec9e62"} Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.239913 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" event={"ID":"409599e0-5f32-4b72-9c6a-73c9d9d4cc63","Type":"ContainerDied","Data":"5e716e651f2ddd84d7825e9cbe10090854eead82a42bfced0a07cec444324464"} Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.239979 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e716e651f2ddd84d7825e9cbe10090854eead82a42bfced0a07cec444324464" Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.240100 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw" Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.272941 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq8t8\" (UniqueName: \"kubernetes.io/projected/370cecdf-191e-4773-914f-eac3264601e4-kube-api-access-cq8t8\") pod \"redhat-marketplace-smpqb\" (UID: \"370cecdf-191e-4773-914f-eac3264601e4\") " pod="openshift-marketplace/redhat-marketplace-smpqb" Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.273138 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/370cecdf-191e-4773-914f-eac3264601e4-catalog-content\") pod \"redhat-marketplace-smpqb\" (UID: \"370cecdf-191e-4773-914f-eac3264601e4\") " pod="openshift-marketplace/redhat-marketplace-smpqb" Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.273296 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/370cecdf-191e-4773-914f-eac3264601e4-utilities\") pod \"redhat-marketplace-smpqb\" (UID: \"370cecdf-191e-4773-914f-eac3264601e4\") " pod="openshift-marketplace/redhat-marketplace-smpqb" Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.341626 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-br6rk"] Jan 05 13:50:02 crc kubenswrapper[4740]: W0105 13:50:02.351736 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc70970cb_3f87_4296_96c0_c827e79eaee6.slice/crio-5d1c561f4db4c1ef45aa873f85d1a76978e1fc5fc0b0d71e2e1dcfdb854891e8 WatchSource:0}: Error finding container 5d1c561f4db4c1ef45aa873f85d1a76978e1fc5fc0b0d71e2e1dcfdb854891e8: Status 404 returned error can't find the container with id 5d1c561f4db4c1ef45aa873f85d1a76978e1fc5fc0b0d71e2e1dcfdb854891e8 Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.374873 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/370cecdf-191e-4773-914f-eac3264601e4-utilities\") pod \"redhat-marketplace-smpqb\" (UID: \"370cecdf-191e-4773-914f-eac3264601e4\") " pod="openshift-marketplace/redhat-marketplace-smpqb" Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.374956 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq8t8\" (UniqueName: \"kubernetes.io/projected/370cecdf-191e-4773-914f-eac3264601e4-kube-api-access-cq8t8\") pod \"redhat-marketplace-smpqb\" (UID: \"370cecdf-191e-4773-914f-eac3264601e4\") " pod="openshift-marketplace/redhat-marketplace-smpqb" Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.375161 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/370cecdf-191e-4773-914f-eac3264601e4-catalog-content\") pod \"redhat-marketplace-smpqb\" (UID: \"370cecdf-191e-4773-914f-eac3264601e4\") " pod="openshift-marketplace/redhat-marketplace-smpqb" Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.375493 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/370cecdf-191e-4773-914f-eac3264601e4-utilities\") pod \"redhat-marketplace-smpqb\" (UID: 
\"370cecdf-191e-4773-914f-eac3264601e4\") " pod="openshift-marketplace/redhat-marketplace-smpqb" Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.375911 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/370cecdf-191e-4773-914f-eac3264601e4-catalog-content\") pod \"redhat-marketplace-smpqb\" (UID: \"370cecdf-191e-4773-914f-eac3264601e4\") " pod="openshift-marketplace/redhat-marketplace-smpqb" Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.395611 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq8t8\" (UniqueName: \"kubernetes.io/projected/370cecdf-191e-4773-914f-eac3264601e4-kube-api-access-cq8t8\") pod \"redhat-marketplace-smpqb\" (UID: \"370cecdf-191e-4773-914f-eac3264601e4\") " pod="openshift-marketplace/redhat-marketplace-smpqb" Jan 05 13:50:02 crc kubenswrapper[4740]: I0105 13:50:02.507016 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-smpqb" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.001001 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-scltx"] Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.006413 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-scltx"] Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.006452 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-smpqb"] Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.006550 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-scltx" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.009709 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.083604 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-utilities\") pod \"redhat-operators-scltx\" (UID: \"2200bbba-e16a-4b74-90b2-9a037bfe0e7d\") " pod="openshift-marketplace/redhat-operators-scltx" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.083693 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbcbk\" (UniqueName: \"kubernetes.io/projected/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-kube-api-access-pbcbk\") pod \"redhat-operators-scltx\" (UID: \"2200bbba-e16a-4b74-90b2-9a037bfe0e7d\") " pod="openshift-marketplace/redhat-operators-scltx" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.084103 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-catalog-content\") pod \"redhat-operators-scltx\" (UID: \"2200bbba-e16a-4b74-90b2-9a037bfe0e7d\") " pod="openshift-marketplace/redhat-operators-scltx" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.137416 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 05 13:50:03 crc kubenswrapper[4740]: 
[-]has-synced failed: reason withheld Jan 05 13:50:03 crc kubenswrapper[4740]: [+]process-running ok Jan 05 13:50:03 crc kubenswrapper[4740]: healthz check failed Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.137490 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.185897 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-utilities\") pod \"redhat-operators-scltx\" (UID: \"2200bbba-e16a-4b74-90b2-9a037bfe0e7d\") " pod="openshift-marketplace/redhat-operators-scltx" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.185954 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbcbk\" (UniqueName: \"kubernetes.io/projected/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-kube-api-access-pbcbk\") pod \"redhat-operators-scltx\" (UID: \"2200bbba-e16a-4b74-90b2-9a037bfe0e7d\") " pod="openshift-marketplace/redhat-operators-scltx" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.186042 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-catalog-content\") pod \"redhat-operators-scltx\" (UID: \"2200bbba-e16a-4b74-90b2-9a037bfe0e7d\") " pod="openshift-marketplace/redhat-operators-scltx" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.186547 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-utilities\") pod \"redhat-operators-scltx\" (UID: \"2200bbba-e16a-4b74-90b2-9a037bfe0e7d\") " pod="openshift-marketplace/redhat-operators-scltx" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.186607 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-catalog-content\") pod \"redhat-operators-scltx\" (UID: \"2200bbba-e16a-4b74-90b2-9a037bfe0e7d\") " pod="openshift-marketplace/redhat-operators-scltx" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.209803 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbcbk\" (UniqueName: \"kubernetes.io/projected/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-kube-api-access-pbcbk\") pod \"redhat-operators-scltx\" (UID: \"2200bbba-e16a-4b74-90b2-9a037bfe0e7d\") " pod="openshift-marketplace/redhat-operators-scltx" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.259627 4740 generic.go:334] "Generic (PLEG): container finished" podID="c70970cb-3f87-4296-96c0-c827e79eaee6" containerID="3d5e70b859486172c4f98f41c8dee74d320a3027dccfc05fe536fb4987eeecf8" exitCode=0 Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.259672 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-br6rk" event={"ID":"c70970cb-3f87-4296-96c0-c827e79eaee6","Type":"ContainerDied","Data":"3d5e70b859486172c4f98f41c8dee74d320a3027dccfc05fe536fb4987eeecf8"} Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.259732 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-br6rk" 
event={"ID":"c70970cb-3f87-4296-96c0-c827e79eaee6","Type":"ContainerStarted","Data":"5d1c561f4db4c1ef45aa873f85d1a76978e1fc5fc0b0d71e2e1dcfdb854891e8"} Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.263537 4740 generic.go:334] "Generic (PLEG): container finished" podID="8e96f2f8-924c-4627-8e20-1e6153e47ce6" containerID="a35918f8a48c4f192cd1452af76b50594542d4245db3aaea8b315c699cf042ca" exitCode=0 Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.263627 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8e96f2f8-924c-4627-8e20-1e6153e47ce6","Type":"ContainerDied","Data":"a35918f8a48c4f192cd1452af76b50594542d4245db3aaea8b315c699cf042ca"} Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.265244 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smpqb" event={"ID":"370cecdf-191e-4773-914f-eac3264601e4","Type":"ContainerStarted","Data":"2831345f81928149114d5110dcaf5dbeb2040ffc393e4c5ce74075ea4def5480"} Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.265293 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smpqb" event={"ID":"370cecdf-191e-4773-914f-eac3264601e4","Type":"ContainerStarted","Data":"10c26ef3179558ac98a2671e5ca614258c10f86e6b117b91d3eb72a0227e6821"} Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.358468 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-scltx" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.374341 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zwx8f"] Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.375530 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zwx8f" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.390766 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zwx8f"] Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.496744 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-catalog-content\") pod \"redhat-operators-zwx8f\" (UID: \"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31\") " pod="openshift-marketplace/redhat-operators-zwx8f" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.496798 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf4rb\" (UniqueName: \"kubernetes.io/projected/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-kube-api-access-nf4rb\") pod \"redhat-operators-zwx8f\" (UID: \"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31\") " pod="openshift-marketplace/redhat-operators-zwx8f" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.496998 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-utilities\") pod \"redhat-operators-zwx8f\" (UID: \"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31\") " pod="openshift-marketplace/redhat-operators-zwx8f" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.600455 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-utilities\") pod \"redhat-operators-zwx8f\" (UID: \"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31\") " pod="openshift-marketplace/redhat-operators-zwx8f" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.600545 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-catalog-content\") pod \"redhat-operators-zwx8f\" (UID: \"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31\") " pod="openshift-marketplace/redhat-operators-zwx8f" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.600576 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf4rb\" (UniqueName: \"kubernetes.io/projected/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-kube-api-access-nf4rb\") pod \"redhat-operators-zwx8f\" (UID: \"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31\") " pod="openshift-marketplace/redhat-operators-zwx8f" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.602793 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-utilities\") pod \"redhat-operators-zwx8f\" (UID: \"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31\") " pod="openshift-marketplace/redhat-operators-zwx8f" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.603030 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-catalog-content\") pod \"redhat-operators-zwx8f\" (UID: \"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31\") " pod="openshift-marketplace/redhat-operators-zwx8f" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.610893 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-scltx"] Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.628686 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf4rb\" (UniqueName: \"kubernetes.io/projected/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-kube-api-access-nf4rb\") pod \"redhat-operators-zwx8f\" (UID: \"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31\") " pod="openshift-marketplace/redhat-operators-zwx8f" Jan 05 13:50:03 crc kubenswrapper[4740]: I0105 13:50:03.761127 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwx8f" Jan 05 13:50:04 crc kubenswrapper[4740]: E0105 13:50:03.999427 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 05 13:50:04 crc kubenswrapper[4740]: E0105 13:50:04.001422 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 05 13:50:04 crc kubenswrapper[4740]: E0105 13:50:04.004847 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 05 13:50:04 crc kubenswrapper[4740]: E0105 13:50:04.004912 4740 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" podUID="1759d33c-e10d-4db8-91aa-60a30ff68255" containerName="kube-multus-additional-cni-plugins" Jan 05 13:50:04 crc kubenswrapper[4740]: I0105 13:50:04.134398 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:50:04 crc kubenswrapper[4740]: I0105 13:50:04.138320 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 05 13:50:04 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Jan 05 13:50:04 crc kubenswrapper[4740]: [+]process-running ok Jan 05 13:50:04 crc kubenswrapper[4740]: healthz check failed Jan 05 13:50:04 crc kubenswrapper[4740]: I0105 13:50:04.138396 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 05 13:50:04 crc kubenswrapper[4740]: I0105 13:50:04.258107 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zwx8f"] Jan 05 13:50:04 crc kubenswrapper[4740]: I0105 13:50:04.289291 4740 generic.go:334] "Generic (PLEG): container finished" 
podID="2200bbba-e16a-4b74-90b2-9a037bfe0e7d" containerID="9efc908e0ab789a4125e18ba7ab335f178a98495d053fd5976a5a12d6009cb12" exitCode=0 Jan 05 13:50:04 crc kubenswrapper[4740]: I0105 13:50:04.289365 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scltx" event={"ID":"2200bbba-e16a-4b74-90b2-9a037bfe0e7d","Type":"ContainerDied","Data":"9efc908e0ab789a4125e18ba7ab335f178a98495d053fd5976a5a12d6009cb12"} Jan 05 13:50:04 crc kubenswrapper[4740]: I0105 13:50:04.289430 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scltx" event={"ID":"2200bbba-e16a-4b74-90b2-9a037bfe0e7d","Type":"ContainerStarted","Data":"af82b9304183f096c9a7b2e44a6bc0ad784085ab3ebc7be6a888c90ca84d35f6"} Jan 05 13:50:04 crc kubenswrapper[4740]: W0105 13:50:04.290183 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a6ab9f1_e8a1_4aac_84ea_059414ff1e31.slice/crio-0e7fa24def0d99b2bba2cc3a47e74b93ab067b924f07823a98e5e26670fb0800 WatchSource:0}: Error finding container 0e7fa24def0d99b2bba2cc3a47e74b93ab067b924f07823a98e5e26670fb0800: Status 404 returned error can't find the container with id 0e7fa24def0d99b2bba2cc3a47e74b93ab067b924f07823a98e5e26670fb0800 Jan 05 13:50:04 crc kubenswrapper[4740]: I0105 13:50:04.294573 4740 generic.go:334] "Generic (PLEG): container finished" podID="370cecdf-191e-4773-914f-eac3264601e4" containerID="2831345f81928149114d5110dcaf5dbeb2040ffc393e4c5ce74075ea4def5480" exitCode=0 Jan 05 13:50:04 crc kubenswrapper[4740]: I0105 13:50:04.294620 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smpqb" event={"ID":"370cecdf-191e-4773-914f-eac3264601e4","Type":"ContainerDied","Data":"2831345f81928149114d5110dcaf5dbeb2040ffc393e4c5ce74075ea4def5480"} Jan 05 13:50:04 crc kubenswrapper[4740]: I0105 13:50:04.668400 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 05 13:50:04 crc kubenswrapper[4740]: I0105 13:50:04.742665 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e96f2f8-924c-4627-8e20-1e6153e47ce6-kubelet-dir\") pod \"8e96f2f8-924c-4627-8e20-1e6153e47ce6\" (UID: \"8e96f2f8-924c-4627-8e20-1e6153e47ce6\") " Jan 05 13:50:04 crc kubenswrapper[4740]: I0105 13:50:04.742757 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e96f2f8-924c-4627-8e20-1e6153e47ce6-kube-api-access\") pod \"8e96f2f8-924c-4627-8e20-1e6153e47ce6\" (UID: \"8e96f2f8-924c-4627-8e20-1e6153e47ce6\") " Jan 05 13:50:04 crc kubenswrapper[4740]: I0105 13:50:04.744428 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e96f2f8-924c-4627-8e20-1e6153e47ce6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8e96f2f8-924c-4627-8e20-1e6153e47ce6" (UID: "8e96f2f8-924c-4627-8e20-1e6153e47ce6"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:50:04 crc kubenswrapper[4740]: I0105 13:50:04.767722 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e96f2f8-924c-4627-8e20-1e6153e47ce6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8e96f2f8-924c-4627-8e20-1e6153e47ce6" (UID: "8e96f2f8-924c-4627-8e20-1e6153e47ce6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:50:04 crc kubenswrapper[4740]: I0105 13:50:04.848408 4740 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e96f2f8-924c-4627-8e20-1e6153e47ce6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:04 crc kubenswrapper[4740]: I0105 13:50:04.848462 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e96f2f8-924c-4627-8e20-1e6153e47ce6-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.136957 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.139607 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.307145 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.307308 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8e96f2f8-924c-4627-8e20-1e6153e47ce6","Type":"ContainerDied","Data":"fa931b5cc91a5ab66a5f3e9d48be198e684dcaebc5d497fd1c6236ea3015aebb"} Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.307354 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa931b5cc91a5ab66a5f3e9d48be198e684dcaebc5d497fd1c6236ea3015aebb" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.321386 4740 generic.go:334] "Generic (PLEG): container finished" podID="6a6ab9f1-e8a1-4aac-84ea-059414ff1e31" containerID="1555cca65e0ba43b9ec762bb13cf2a5879339f5a1aaaec3f9c78fe3c10bb2319" exitCode=0 Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.322424 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwx8f" event={"ID":"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31","Type":"ContainerDied","Data":"1555cca65e0ba43b9ec762bb13cf2a5879339f5a1aaaec3f9c78fe3c10bb2319"} Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.322448 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwx8f" event={"ID":"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31","Type":"ContainerStarted","Data":"0e7fa24def0d99b2bba2cc3a47e74b93ab067b924f07823a98e5e26670fb0800"} Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.864455 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 
13:50:05.864507 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.864550 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.864574 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.865496 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.872914 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.873588 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.891648 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.939533 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 05 13:50:05 crc kubenswrapper[4740]: E0105 13:50:05.939755 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e96f2f8-924c-4627-8e20-1e6153e47ce6" containerName="pruner" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.939768 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e96f2f8-924c-4627-8e20-1e6153e47ce6" containerName="pruner" Jan 05 13:50:05 
crc kubenswrapper[4740]: I0105 13:50:05.939872 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e96f2f8-924c-4627-8e20-1e6153e47ce6" containerName="pruner" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.947195 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.950469 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.951123 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.955421 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.966158 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26fa6c73-586f-4587-8ba7-581126d1b072-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"26fa6c73-586f-4587-8ba7-581126d1b072\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 05 13:50:05 crc kubenswrapper[4740]: I0105 13:50:05.966194 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/26fa6c73-586f-4587-8ba7-581126d1b072-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"26fa6c73-586f-4587-8ba7-581126d1b072\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 05 13:50:06 crc kubenswrapper[4740]: I0105 13:50:06.071811 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26fa6c73-586f-4587-8ba7-581126d1b072-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"26fa6c73-586f-4587-8ba7-581126d1b072\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 05 13:50:06 crc kubenswrapper[4740]: I0105 13:50:06.071868 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/26fa6c73-586f-4587-8ba7-581126d1b072-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"26fa6c73-586f-4587-8ba7-581126d1b072\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 05 13:50:06 crc kubenswrapper[4740]: I0105 13:50:06.072079 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/26fa6c73-586f-4587-8ba7-581126d1b072-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"26fa6c73-586f-4587-8ba7-581126d1b072\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 05 13:50:06 crc kubenswrapper[4740]: I0105 13:50:06.084401 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:50:06 crc kubenswrapper[4740]: I0105 13:50:06.100884 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 05 13:50:06 crc kubenswrapper[4740]: I0105 13:50:06.102304 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26fa6c73-586f-4587-8ba7-581126d1b072-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"26fa6c73-586f-4587-8ba7-581126d1b072\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 05 13:50:06 crc kubenswrapper[4740]: I0105 13:50:06.107492 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 05 13:50:06 crc kubenswrapper[4740]: I0105 13:50:06.269486 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 05 13:50:06 crc kubenswrapper[4740]: I0105 13:50:06.644443 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 05 13:50:06 crc kubenswrapper[4740]: W0105 13:50:06.716022 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-0787ebee121d0cedafe99897f45b03010351e49efb15d5aaf7ef335c2488f5fd WatchSource:0}: Error finding container 0787ebee121d0cedafe99897f45b03010351e49efb15d5aaf7ef335c2488f5fd: Status 404 returned error can't find the container with id 0787ebee121d0cedafe99897f45b03010351e49efb15d5aaf7ef335c2488f5fd Jan 05 13:50:06 crc kubenswrapper[4740]: W0105 13:50:06.726995 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-985da1bcb076a4a89f198eb786f2d99dc5325086ca750d575f792d2848911bad WatchSource:0}: Error finding container 985da1bcb076a4a89f198eb786f2d99dc5325086ca750d575f792d2848911bad: Status 404 returned error can't find the container with id 985da1bcb076a4a89f198eb786f2d99dc5325086ca750d575f792d2848911bad Jan 05 13:50:06 crc kubenswrapper[4740]: W0105 13:50:06.736798 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-250e443d724f3b8411b5fda441218a938ad831ba09b6ef236a16c8b12e2ee201 WatchSource:0}: Error finding container 250e443d724f3b8411b5fda441218a938ad831ba09b6ef236a16c8b12e2ee201: Status 404 returned error can't find the container with id 250e443d724f3b8411b5fda441218a938ad831ba09b6ef236a16c8b12e2ee201 Jan 05 13:50:07 crc kubenswrapper[4740]: I0105 13:50:07.342030 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"26fa6c73-586f-4587-8ba7-581126d1b072","Type":"ContainerStarted","Data":"d7c44bdd9d4826e722bbb6f36d85abc5595934c9ef15e8118c00b1a246d652cf"} Jan 05 13:50:07 crc kubenswrapper[4740]: I0105 13:50:07.348994 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0787ebee121d0cedafe99897f45b03010351e49efb15d5aaf7ef335c2488f5fd"} Jan 05 13:50:07 crc kubenswrapper[4740]: I0105 13:50:07.359194 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"250e443d724f3b8411b5fda441218a938ad831ba09b6ef236a16c8b12e2ee201"} Jan 05 13:50:07 crc kubenswrapper[4740]: I0105 13:50:07.361938 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"985da1bcb076a4a89f198eb786f2d99dc5325086ca750d575f792d2848911bad"} Jan 05 13:50:08 crc kubenswrapper[4740]: I0105 13:50:08.388340 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a40009bd220c3297e67c2caacf66cff87440364d0865a3f32a300b74f1ba6f2f"} Jan 05 13:50:08 crc kubenswrapper[4740]: I0105 13:50:08.397904 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2e4350be36f0dfaa06dba40adf44007a18c8624923c11291c69363597e8b5999"} Jan 05 13:50:08 crc kubenswrapper[4740]: I0105 13:50:08.415264 4740 generic.go:334] "Generic (PLEG): container finished" podID="26fa6c73-586f-4587-8ba7-581126d1b072" containerID="b9e4f477f9308f2989363b2f179c8cc001d875465b7f0071d7f65798bd933e58" exitCode=0 Jan 05 13:50:08 crc kubenswrapper[4740]: I0105 13:50:08.415374 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"26fa6c73-586f-4587-8ba7-581126d1b072","Type":"ContainerDied","Data":"b9e4f477f9308f2989363b2f179c8cc001d875465b7f0071d7f65798bd933e58"} Jan 05 13:50:08 crc kubenswrapper[4740]: I0105 13:50:08.419873 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"809413516c981f17dd15ad421a0e3664002ffe7f2c06c17c833522c742eeea48"} Jan 05 13:50:08 crc kubenswrapper[4740]: I0105 13:50:08.419968 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:50:09 crc kubenswrapper[4740]: I0105 13:50:09.096013 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ppzj9" Jan 05 13:50:09 crc kubenswrapper[4740]: I0105 13:50:09.661818 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 05 13:50:09 crc kubenswrapper[4740]: I0105 13:50:09.855005 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/26fa6c73-586f-4587-8ba7-581126d1b072-kubelet-dir\") pod \"26fa6c73-586f-4587-8ba7-581126d1b072\" (UID: \"26fa6c73-586f-4587-8ba7-581126d1b072\") " Jan 05 13:50:09 crc kubenswrapper[4740]: I0105 13:50:09.855093 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26fa6c73-586f-4587-8ba7-581126d1b072-kube-api-access\") pod \"26fa6c73-586f-4587-8ba7-581126d1b072\" (UID: \"26fa6c73-586f-4587-8ba7-581126d1b072\") " Jan 05 13:50:09 crc kubenswrapper[4740]: I0105 13:50:09.855168 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26fa6c73-586f-4587-8ba7-581126d1b072-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "26fa6c73-586f-4587-8ba7-581126d1b072" (UID: "26fa6c73-586f-4587-8ba7-581126d1b072"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:50:09 crc kubenswrapper[4740]: I0105 13:50:09.855433 4740 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/26fa6c73-586f-4587-8ba7-581126d1b072-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:09 crc kubenswrapper[4740]: I0105 13:50:09.864096 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26fa6c73-586f-4587-8ba7-581126d1b072-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "26fa6c73-586f-4587-8ba7-581126d1b072" (UID: "26fa6c73-586f-4587-8ba7-581126d1b072"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:50:09 crc kubenswrapper[4740]: I0105 13:50:09.956552 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26fa6c73-586f-4587-8ba7-581126d1b072-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:10 crc kubenswrapper[4740]: I0105 13:50:10.439088 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"26fa6c73-586f-4587-8ba7-581126d1b072","Type":"ContainerDied","Data":"d7c44bdd9d4826e722bbb6f36d85abc5595934c9ef15e8118c00b1a246d652cf"} Jan 05 13:50:10 crc kubenswrapper[4740]: I0105 13:50:10.439122 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7c44bdd9d4826e722bbb6f36d85abc5595934c9ef15e8118c00b1a246d652cf" Jan 05 13:50:10 crc kubenswrapper[4740]: I0105 13:50:10.439135 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 05 13:50:10 crc kubenswrapper[4740]: I0105 13:50:10.818717 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:50:11 crc kubenswrapper[4740]: I0105 13:50:11.338983 4740 patch_prober.go:28] interesting pod/console-f9d7485db-qj9kj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Jan 05 13:50:11 crc kubenswrapper[4740]: I0105 13:50:11.339676 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-qj9kj" podUID="039f49cf-6394-4a22-ba5b-e5b681a51ca6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.5:8443/health\": dial tcp 10.217.0.5:8443: connect: connection refused" Jan 05 13:50:13 crc kubenswrapper[4740]: E0105 13:50:13.992476 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 05 13:50:13 crc kubenswrapper[4740]: E0105 13:50:13.993883 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 05 13:50:13 crc kubenswrapper[4740]: E0105 13:50:13.994965 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 05 13:50:13 crc kubenswrapper[4740]: E0105 13:50:13.995045 4740 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" podUID="1759d33c-e10d-4db8-91aa-60a30ff68255" containerName="kube-multus-additional-cni-plugins" Jan 05 13:50:14 crc kubenswrapper[4740]: I0105 13:50:14.570742 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z5cdg"] Jan 05 13:50:14 crc kubenswrapper[4740]: I0105 13:50:14.571553 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" podUID="3899b43b-6fe3-4a6d-9434-9a4754669370" containerName="controller-manager" containerID="cri-o://a7e9eb90bacd6d3a4f33afac26d9f1927ef9103dffee4c458fcd269d9d9ce11c" gracePeriod=30 Jan 05 13:50:14 crc kubenswrapper[4740]: I0105 13:50:14.673224 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475"] Jan 05 13:50:14 crc kubenswrapper[4740]: I0105 13:50:14.673456 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" 
podUID="577f43f3-8470-4de3-ab3b-9934f1deab62" containerName="route-controller-manager" containerID="cri-o://20179e62d0bc0fe542f80a38a88a74790d63cf1372f5ac028d2d5381f64d3d85" gracePeriod=30 Jan 05 13:50:15 crc kubenswrapper[4740]: I0105 13:50:15.465205 4740 generic.go:334] "Generic (PLEG): container finished" podID="577f43f3-8470-4de3-ab3b-9934f1deab62" containerID="20179e62d0bc0fe542f80a38a88a74790d63cf1372f5ac028d2d5381f64d3d85" exitCode=0 Jan 05 13:50:15 crc kubenswrapper[4740]: I0105 13:50:15.465306 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" event={"ID":"577f43f3-8470-4de3-ab3b-9934f1deab62","Type":"ContainerDied","Data":"20179e62d0bc0fe542f80a38a88a74790d63cf1372f5ac028d2d5381f64d3d85"} Jan 05 13:50:15 crc kubenswrapper[4740]: I0105 13:50:15.467174 4740 generic.go:334] "Generic (PLEG): container finished" podID="3899b43b-6fe3-4a6d-9434-9a4754669370" containerID="a7e9eb90bacd6d3a4f33afac26d9f1927ef9103dffee4c458fcd269d9d9ce11c" exitCode=0 Jan 05 13:50:15 crc kubenswrapper[4740]: I0105 13:50:15.467214 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" event={"ID":"3899b43b-6fe3-4a6d-9434-9a4754669370","Type":"ContainerDied","Data":"a7e9eb90bacd6d3a4f33afac26d9f1927ef9103dffee4c458fcd269d9d9ce11c"} Jan 05 13:50:19 crc kubenswrapper[4740]: I0105 13:50:19.478307 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:50:21 crc kubenswrapper[4740]: I0105 13:50:21.343379 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:50:21 crc kubenswrapper[4740]: I0105 13:50:21.348877 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:50:22 crc kubenswrapper[4740]: I0105 13:50:22.433419 4740 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bz475 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 13:50:22 crc kubenswrapper[4740]: I0105 13:50:22.433532 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" podUID="577f43f3-8470-4de3-ab3b-9934f1deab62" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 13:50:22 crc kubenswrapper[4740]: I0105 13:50:22.591349 4740 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-z5cdg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: i/o timeout" start-of-body= Jan 05 13:50:22 crc kubenswrapper[4740]: I0105 13:50:22.591417 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" podUID="3899b43b-6fe3-4a6d-9434-9a4754669370" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 
10.217.0.12:8443: i/o timeout" Jan 05 13:50:22 crc kubenswrapper[4740]: I0105 13:50:22.611662 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 05 13:50:23 crc kubenswrapper[4740]: E0105 13:50:23.993033 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 05 13:50:23 crc kubenswrapper[4740]: E0105 13:50:23.994716 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 05 13:50:23 crc kubenswrapper[4740]: E0105 13:50:23.996121 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 05 13:50:23 crc kubenswrapper[4740]: E0105 13:50:23.996188 4740 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" podUID="1759d33c-e10d-4db8-91aa-60a30ff68255" containerName="kube-multus-additional-cni-plugins" Jan 05 13:50:28 crc kubenswrapper[4740]: E0105 13:50:28.940163 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 05 13:50:28 crc kubenswrapper[4740]: E0105 13:50:28.940608 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6kfdw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-pkbs6_openshift-marketplace(8655819a-f2c2-4044-85e1-84f7e64cff21): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 05 13:50:28 crc kubenswrapper[4740]: E0105 13:50:28.941802 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-pkbs6" podUID="8655819a-f2c2-4044-85e1-84f7e64cff21" Jan 05 13:50:29 crc kubenswrapper[4740]: I0105 13:50:29.643502 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-lg5bc_1759d33c-e10d-4db8-91aa-60a30ff68255/kube-multus-additional-cni-plugins/0.log" Jan 05 13:50:29 crc kubenswrapper[4740]: I0105 13:50:29.643823 4740 generic.go:334] "Generic (PLEG): container finished" podID="1759d33c-e10d-4db8-91aa-60a30ff68255" containerID="a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179" exitCode=137 Jan 05 13:50:29 crc kubenswrapper[4740]: I0105 13:50:29.643976 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" event={"ID":"1759d33c-e10d-4db8-91aa-60a30ff68255","Type":"ContainerDied","Data":"a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179"} Jan 05 13:50:29 crc kubenswrapper[4740]: I0105 13:50:29.665424 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=7.665401176 podStartE2EDuration="7.665401176s" podCreationTimestamp="2026-01-05 13:50:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:50:29.656145738 +0000 UTC m=+78.963054397" watchObservedRunningTime="2026-01-05 13:50:29.665401176 +0000 UTC m=+78.972309805" Jan 05 13:50:32 crc kubenswrapper[4740]: I0105 13:50:32.434274 4740 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bz475 
container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: i/o timeout" start-of-body= Jan 05 13:50:32 crc kubenswrapper[4740]: I0105 13:50:32.434356 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" podUID="577f43f3-8470-4de3-ab3b-9934f1deab62" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: i/o timeout" Jan 05 13:50:32 crc kubenswrapper[4740]: E0105 13:50:32.497335 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-pkbs6" podUID="8655819a-f2c2-4044-85e1-84f7e64cff21" Jan 05 13:50:32 crc kubenswrapper[4740]: E0105 13:50:32.518903 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 05 13:50:32 crc kubenswrapper[4740]: E0105 13:50:32.519416 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2l8db,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hfzkk_openshift-marketplace(f781df9d-cacf-433b-a588-35f67af41b66): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 05 13:50:32 crc kubenswrapper[4740]: E0105 13:50:32.520746 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" 
pod="openshift-marketplace/community-operators-hfzkk" podUID="f781df9d-cacf-433b-a588-35f67af41b66" Jan 05 13:50:32 crc kubenswrapper[4740]: E0105 13:50:32.566821 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 05 13:50:32 crc kubenswrapper[4740]: E0105 13:50:32.567041 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nf4rb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zwx8f_openshift-marketplace(6a6ab9f1-e8a1-4aac-84ea-059414ff1e31): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 05 13:50:32 crc kubenswrapper[4740]: E0105 13:50:32.568320 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zwx8f" podUID="6a6ab9f1-e8a1-4aac-84ea-059414ff1e31" Jan 05 13:50:32 crc kubenswrapper[4740]: E0105 13:50:32.591209 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 05 13:50:32 crc kubenswrapper[4740]: I0105 13:50:32.591310 4740 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-z5cdg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 13:50:32 crc kubenswrapper[4740]: I0105 13:50:32.591388 4740 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" podUID="3899b43b-6fe3-4a6d-9434-9a4754669370" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 13:50:32 crc kubenswrapper[4740]: E0105 13:50:32.591778 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pbcbk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-scltx_openshift-marketplace(2200bbba-e16a-4b74-90b2-9a037bfe0e7d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 05 13:50:32 crc kubenswrapper[4740]: E0105 13:50:32.592999 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-scltx" podUID="2200bbba-e16a-4b74-90b2-9a037bfe0e7d" Jan 05 13:50:33 crc kubenswrapper[4740]: I0105 13:50:33.907660 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82" Jan 05 13:50:33 crc kubenswrapper[4740]: E0105 13:50:33.990512 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179 is running failed: container process not found" containerID="a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 05 13:50:33 crc kubenswrapper[4740]: E0105 13:50:33.991628 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179 is running failed: container process not found" containerID="a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 05 13:50:33 crc kubenswrapper[4740]: E0105 13:50:33.992284 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179 is running failed: container process not found" containerID="a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 05 13:50:33 crc kubenswrapper[4740]: E0105 13:50:33.992388 4740 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" podUID="1759d33c-e10d-4db8-91aa-60a30ff68255" containerName="kube-multus-additional-cni-plugins" Jan 05 13:50:38 crc kubenswrapper[4740]: E0105 13:50:38.562444 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-scltx" podUID="2200bbba-e16a-4b74-90b2-9a037bfe0e7d" Jan 05 13:50:38 crc kubenswrapper[4740]: E0105 13:50:38.562967 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hfzkk" podUID="f781df9d-cacf-433b-a588-35f67af41b66" Jan 05 13:50:38 crc kubenswrapper[4740]: E0105 13:50:38.563087 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zwx8f" podUID="6a6ab9f1-e8a1-4aac-84ea-059414ff1e31" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.639530 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.679904 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-69f74844c8-4rq4j"] Jan 05 13:50:38 crc kubenswrapper[4740]: E0105 13:50:38.680635 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3899b43b-6fe3-4a6d-9434-9a4754669370" containerName="controller-manager" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.680746 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3899b43b-6fe3-4a6d-9434-9a4754669370" containerName="controller-manager" Jan 05 13:50:38 crc kubenswrapper[4740]: E0105 13:50:38.680852 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26fa6c73-586f-4587-8ba7-581126d1b072" containerName="pruner" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.680938 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="26fa6c73-586f-4587-8ba7-581126d1b072" containerName="pruner" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.681197 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="3899b43b-6fe3-4a6d-9434-9a4754669370" containerName="controller-manager" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.681300 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="26fa6c73-586f-4587-8ba7-581126d1b072" containerName="pruner" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.681925 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.704329 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69f74844c8-4rq4j"] Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.708575 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.709666 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z5cdg" event={"ID":"3899b43b-6fe3-4a6d-9434-9a4754669370","Type":"ContainerDied","Data":"38e5a5cd0c475a41f25bb67ef0fc813fb8b47f539c52cef9b63c998902ccb58a"} Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.709841 4740 scope.go:117] "RemoveContainer" containerID="a7e9eb90bacd6d3a4f33afac26d9f1927ef9103dffee4c458fcd269d9d9ce11c" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.743453 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bmd9\" (UniqueName: \"kubernetes.io/projected/3899b43b-6fe3-4a6d-9434-9a4754669370-kube-api-access-6bmd9\") pod \"3899b43b-6fe3-4a6d-9434-9a4754669370\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.743853 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-client-ca\") pod \"3899b43b-6fe3-4a6d-9434-9a4754669370\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.743942 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-proxy-ca-bundles\") pod \"3899b43b-6fe3-4a6d-9434-9a4754669370\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.744029 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-config\") pod \"3899b43b-6fe3-4a6d-9434-9a4754669370\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.744113 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3899b43b-6fe3-4a6d-9434-9a4754669370-serving-cert\") pod \"3899b43b-6fe3-4a6d-9434-9a4754669370\" (UID: \"3899b43b-6fe3-4a6d-9434-9a4754669370\") " Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.744376 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-proxy-ca-bundles\") pod \"controller-manager-69f74844c8-4rq4j\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.744540 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r49xt\" (UniqueName: \"kubernetes.io/projected/4f2ba57f-6524-465b-aefc-bbd0f56accdc-kube-api-access-r49xt\") pod \"controller-manager-69f74844c8-4rq4j\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.744611 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f2ba57f-6524-465b-aefc-bbd0f56accdc-serving-cert\") pod 
\"controller-manager-69f74844c8-4rq4j\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.744675 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-config\") pod \"controller-manager-69f74844c8-4rq4j\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.744673 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-client-ca" (OuterVolumeSpecName: "client-ca") pod "3899b43b-6fe3-4a6d-9434-9a4754669370" (UID: "3899b43b-6fe3-4a6d-9434-9a4754669370"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.744684 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3899b43b-6fe3-4a6d-9434-9a4754669370" (UID: "3899b43b-6fe3-4a6d-9434-9a4754669370"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.744794 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-client-ca\") pod \"controller-manager-69f74844c8-4rq4j\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.744871 4740 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.744888 4740 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-client-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.745301 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-config" (OuterVolumeSpecName: "config") pod "3899b43b-6fe3-4a6d-9434-9a4754669370" (UID: "3899b43b-6fe3-4a6d-9434-9a4754669370"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.749949 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3899b43b-6fe3-4a6d-9434-9a4754669370-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3899b43b-6fe3-4a6d-9434-9a4754669370" (UID: "3899b43b-6fe3-4a6d-9434-9a4754669370"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.753025 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3899b43b-6fe3-4a6d-9434-9a4754669370-kube-api-access-6bmd9" (OuterVolumeSpecName: "kube-api-access-6bmd9") pod "3899b43b-6fe3-4a6d-9434-9a4754669370" (UID: "3899b43b-6fe3-4a6d-9434-9a4754669370"). InnerVolumeSpecName "kube-api-access-6bmd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.845558 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-client-ca\") pod \"controller-manager-69f74844c8-4rq4j\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.846591 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-proxy-ca-bundles\") pod \"controller-manager-69f74844c8-4rq4j\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.846516 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-client-ca\") pod \"controller-manager-69f74844c8-4rq4j\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.846676 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r49xt\" (UniqueName: \"kubernetes.io/projected/4f2ba57f-6524-465b-aefc-bbd0f56accdc-kube-api-access-r49xt\") pod \"controller-manager-69f74844c8-4rq4j\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.847818 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-proxy-ca-bundles\") pod \"controller-manager-69f74844c8-4rq4j\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.847887 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f2ba57f-6524-465b-aefc-bbd0f56accdc-serving-cert\") pod \"controller-manager-69f74844c8-4rq4j\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.848288 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-config\") pod \"controller-manager-69f74844c8-4rq4j\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.848339 4740 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3899b43b-6fe3-4a6d-9434-9a4754669370-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.848351 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3899b43b-6fe3-4a6d-9434-9a4754669370-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.848360 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bmd9\" (UniqueName: \"kubernetes.io/projected/3899b43b-6fe3-4a6d-9434-9a4754669370-kube-api-access-6bmd9\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.849809 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-config\") pod \"controller-manager-69f74844c8-4rq4j\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.851922 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f2ba57f-6524-465b-aefc-bbd0f56accdc-serving-cert\") pod \"controller-manager-69f74844c8-4rq4j\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:38 crc kubenswrapper[4740]: I0105 13:50:38.864924 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r49xt\" (UniqueName: \"kubernetes.io/projected/4f2ba57f-6524-465b-aefc-bbd0f56accdc-kube-api-access-r49xt\") pod \"controller-manager-69f74844c8-4rq4j\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:39 crc kubenswrapper[4740]: I0105 13:50:39.011148 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:39 crc kubenswrapper[4740]: I0105 13:50:39.044287 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z5cdg"] Jan 05 13:50:39 crc kubenswrapper[4740]: I0105 13:50:39.048012 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z5cdg"] Jan 05 13:50:39 crc kubenswrapper[4740]: E0105 13:50:39.272768 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 05 13:50:39 crc kubenswrapper[4740]: E0105 13:50:39.273288 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fplgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-br6rk_openshift-marketplace(c70970cb-3f87-4296-96c0-c827e79eaee6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 05 13:50:39 crc kubenswrapper[4740]: E0105 13:50:39.274624 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-br6rk" podUID="c70970cb-3f87-4296-96c0-c827e79eaee6" Jan 05 13:50:39 crc kubenswrapper[4740]: E0105 13:50:39.351287 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 05 13:50:39 crc kubenswrapper[4740]: E0105 13:50:39.351546 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cq8t8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-smpqb_openshift-marketplace(370cecdf-191e-4773-914f-eac3264601e4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 05 13:50:39 crc kubenswrapper[4740]: E0105 13:50:39.352666 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-smpqb" podUID="370cecdf-191e-4773-914f-eac3264601e4" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.327655 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.328902 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.331530 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.332190 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.332432 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.367753 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48381fb8-5fa8-48c4-be6d-322df35570b1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"48381fb8-5fa8-48c4-be6d-322df35570b1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.367824 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48381fb8-5fa8-48c4-be6d-322df35570b1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"48381fb8-5fa8-48c4-be6d-322df35570b1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.468501 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48381fb8-5fa8-48c4-be6d-322df35570b1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"48381fb8-5fa8-48c4-be6d-322df35570b1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.469146 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48381fb8-5fa8-48c4-be6d-322df35570b1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"48381fb8-5fa8-48c4-be6d-322df35570b1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.469225 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48381fb8-5fa8-48c4-be6d-322df35570b1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"48381fb8-5fa8-48c4-be6d-322df35570b1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.487321 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48381fb8-5fa8-48c4-be6d-322df35570b1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"48381fb8-5fa8-48c4-be6d-322df35570b1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.648429 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 05 13:50:40 crc kubenswrapper[4740]: E0105 13:50:40.814310 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-smpqb" podUID="370cecdf-191e-4773-914f-eac3264601e4" Jan 05 13:50:40 crc kubenswrapper[4740]: E0105 13:50:40.814632 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-br6rk" podUID="c70970cb-3f87-4296-96c0-c827e79eaee6" Jan 05 13:50:40 crc kubenswrapper[4740]: E0105 13:50:40.883421 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 05 13:50:40 crc kubenswrapper[4740]: E0105 13:50:40.883572 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pcdnr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-zvjqq_openshift-marketplace(888ada4e-9063-44ac-b293-8de842edb998): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 05 13:50:40 crc kubenswrapper[4740]: E0105 13:50:40.884763 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-zvjqq" podUID="888ada4e-9063-44ac-b293-8de842edb998" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.887753 
4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.897409 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-lg5bc_1759d33c-e10d-4db8-91aa-60a30ff68255/kube-multus-additional-cni-plugins/0.log" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.897478 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:50:40 crc kubenswrapper[4740]: E0105 13:50:40.908578 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 05 13:50:40 crc kubenswrapper[4740]: E0105 13:50:40.908693 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x8c5z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-k8t6t_openshift-marketplace(d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 05 13:50:40 crc kubenswrapper[4740]: E0105 13:50:40.910743 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-k8t6t" podUID="d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.913277 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc"] Jan 05 13:50:40 crc kubenswrapper[4740]: E0105 13:50:40.913532 4740 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="577f43f3-8470-4de3-ab3b-9934f1deab62" containerName="route-controller-manager" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.913547 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="577f43f3-8470-4de3-ab3b-9934f1deab62" containerName="route-controller-manager" Jan 05 13:50:40 crc kubenswrapper[4740]: E0105 13:50:40.913567 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1759d33c-e10d-4db8-91aa-60a30ff68255" containerName="kube-multus-additional-cni-plugins" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.913574 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1759d33c-e10d-4db8-91aa-60a30ff68255" containerName="kube-multus-additional-cni-plugins" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.913700 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="1759d33c-e10d-4db8-91aa-60a30ff68255" containerName="kube-multus-additional-cni-plugins" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.913714 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="577f43f3-8470-4de3-ab3b-9934f1deab62" containerName="route-controller-manager" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.914106 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.931372 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc"] Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.975429 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3899b43b-6fe3-4a6d-9434-9a4754669370" path="/var/lib/kubelet/pods/3899b43b-6fe3-4a6d-9434-9a4754669370/volumes" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.975711 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9kcq\" (UniqueName: \"kubernetes.io/projected/577f43f3-8470-4de3-ab3b-9934f1deab62-kube-api-access-w9kcq\") pod \"577f43f3-8470-4de3-ab3b-9934f1deab62\" (UID: \"577f43f3-8470-4de3-ab3b-9934f1deab62\") " Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.975780 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvfcw\" (UniqueName: \"kubernetes.io/projected/1759d33c-e10d-4db8-91aa-60a30ff68255-kube-api-access-cvfcw\") pod \"1759d33c-e10d-4db8-91aa-60a30ff68255\" (UID: \"1759d33c-e10d-4db8-91aa-60a30ff68255\") " Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.975829 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/577f43f3-8470-4de3-ab3b-9934f1deab62-client-ca\") pod \"577f43f3-8470-4de3-ab3b-9934f1deab62\" (UID: \"577f43f3-8470-4de3-ab3b-9934f1deab62\") " Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.975853 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/577f43f3-8470-4de3-ab3b-9934f1deab62-config\") pod \"577f43f3-8470-4de3-ab3b-9934f1deab62\" (UID: \"577f43f3-8470-4de3-ab3b-9934f1deab62\") " Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.975875 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1759d33c-e10d-4db8-91aa-60a30ff68255-ready\") pod \"1759d33c-e10d-4db8-91aa-60a30ff68255\" 
(UID: \"1759d33c-e10d-4db8-91aa-60a30ff68255\") " Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.975916 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1759d33c-e10d-4db8-91aa-60a30ff68255-tuning-conf-dir\") pod \"1759d33c-e10d-4db8-91aa-60a30ff68255\" (UID: \"1759d33c-e10d-4db8-91aa-60a30ff68255\") " Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.975935 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/577f43f3-8470-4de3-ab3b-9934f1deab62-serving-cert\") pod \"577f43f3-8470-4de3-ab3b-9934f1deab62\" (UID: \"577f43f3-8470-4de3-ab3b-9934f1deab62\") " Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.975970 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1759d33c-e10d-4db8-91aa-60a30ff68255-cni-sysctl-allowlist\") pod \"1759d33c-e10d-4db8-91aa-60a30ff68255\" (UID: \"1759d33c-e10d-4db8-91aa-60a30ff68255\") " Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.976148 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa7b746a-a66d-4b14-8184-b41729626e15-serving-cert\") pod \"route-controller-manager-6bc97b9f78-lsjlc\" (UID: \"fa7b746a-a66d-4b14-8184-b41729626e15\") " pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.976175 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa7b746a-a66d-4b14-8184-b41729626e15-config\") pod \"route-controller-manager-6bc97b9f78-lsjlc\" (UID: \"fa7b746a-a66d-4b14-8184-b41729626e15\") " pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.976196 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvzxp\" (UniqueName: \"kubernetes.io/projected/fa7b746a-a66d-4b14-8184-b41729626e15-kube-api-access-wvzxp\") pod \"route-controller-manager-6bc97b9f78-lsjlc\" (UID: \"fa7b746a-a66d-4b14-8184-b41729626e15\") " pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.976271 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa7b746a-a66d-4b14-8184-b41729626e15-client-ca\") pod \"route-controller-manager-6bc97b9f78-lsjlc\" (UID: \"fa7b746a-a66d-4b14-8184-b41729626e15\") " pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.976285 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1759d33c-e10d-4db8-91aa-60a30ff68255-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "1759d33c-e10d-4db8-91aa-60a30ff68255" (UID: "1759d33c-e10d-4db8-91aa-60a30ff68255"). InnerVolumeSpecName "tuning-conf-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.977361 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1759d33c-e10d-4db8-91aa-60a30ff68255-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "1759d33c-e10d-4db8-91aa-60a30ff68255" (UID: "1759d33c-e10d-4db8-91aa-60a30ff68255"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.978363 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1759d33c-e10d-4db8-91aa-60a30ff68255-ready" (OuterVolumeSpecName: "ready") pod "1759d33c-e10d-4db8-91aa-60a30ff68255" (UID: "1759d33c-e10d-4db8-91aa-60a30ff68255"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.978395 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/577f43f3-8470-4de3-ab3b-9934f1deab62-config" (OuterVolumeSpecName: "config") pod "577f43f3-8470-4de3-ab3b-9934f1deab62" (UID: "577f43f3-8470-4de3-ab3b-9934f1deab62"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.978832 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/577f43f3-8470-4de3-ab3b-9934f1deab62-client-ca" (OuterVolumeSpecName: "client-ca") pod "577f43f3-8470-4de3-ab3b-9934f1deab62" (UID: "577f43f3-8470-4de3-ab3b-9934f1deab62"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.981284 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1759d33c-e10d-4db8-91aa-60a30ff68255-kube-api-access-cvfcw" (OuterVolumeSpecName: "kube-api-access-cvfcw") pod "1759d33c-e10d-4db8-91aa-60a30ff68255" (UID: "1759d33c-e10d-4db8-91aa-60a30ff68255"). InnerVolumeSpecName "kube-api-access-cvfcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.990280 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/577f43f3-8470-4de3-ab3b-9934f1deab62-kube-api-access-w9kcq" (OuterVolumeSpecName: "kube-api-access-w9kcq") pod "577f43f3-8470-4de3-ab3b-9934f1deab62" (UID: "577f43f3-8470-4de3-ab3b-9934f1deab62"). InnerVolumeSpecName "kube-api-access-w9kcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:50:40 crc kubenswrapper[4740]: I0105 13:50:40.991032 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/577f43f3-8470-4de3-ab3b-9934f1deab62-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "577f43f3-8470-4de3-ab3b-9934f1deab62" (UID: "577f43f3-8470-4de3-ab3b-9934f1deab62"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.031989 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69f74844c8-4rq4j"] Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.065307 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.077173 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa7b746a-a66d-4b14-8184-b41729626e15-client-ca\") pod \"route-controller-manager-6bc97b9f78-lsjlc\" (UID: \"fa7b746a-a66d-4b14-8184-b41729626e15\") " pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.077218 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa7b746a-a66d-4b14-8184-b41729626e15-serving-cert\") pod \"route-controller-manager-6bc97b9f78-lsjlc\" (UID: \"fa7b746a-a66d-4b14-8184-b41729626e15\") " pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.077244 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa7b746a-a66d-4b14-8184-b41729626e15-config\") pod \"route-controller-manager-6bc97b9f78-lsjlc\" (UID: \"fa7b746a-a66d-4b14-8184-b41729626e15\") " pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.077271 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvzxp\" (UniqueName: \"kubernetes.io/projected/fa7b746a-a66d-4b14-8184-b41729626e15-kube-api-access-wvzxp\") pod \"route-controller-manager-6bc97b9f78-lsjlc\" (UID: \"fa7b746a-a66d-4b14-8184-b41729626e15\") " pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.077323 4740 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1759d33c-e10d-4db8-91aa-60a30ff68255-ready\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.077333 4740 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1759d33c-e10d-4db8-91aa-60a30ff68255-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.077345 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/577f43f3-8470-4de3-ab3b-9934f1deab62-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.077354 4740 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1759d33c-e10d-4db8-91aa-60a30ff68255-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.077362 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9kcq\" (UniqueName: \"kubernetes.io/projected/577f43f3-8470-4de3-ab3b-9934f1deab62-kube-api-access-w9kcq\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 
13:50:41.077372 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvfcw\" (UniqueName: \"kubernetes.io/projected/1759d33c-e10d-4db8-91aa-60a30ff68255-kube-api-access-cvfcw\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.077380 4740 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/577f43f3-8470-4de3-ab3b-9934f1deab62-client-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.077389 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/577f43f3-8470-4de3-ab3b-9934f1deab62-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.078302 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa7b746a-a66d-4b14-8184-b41729626e15-client-ca\") pod \"route-controller-manager-6bc97b9f78-lsjlc\" (UID: \"fa7b746a-a66d-4b14-8184-b41729626e15\") " pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.079385 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa7b746a-a66d-4b14-8184-b41729626e15-config\") pod \"route-controller-manager-6bc97b9f78-lsjlc\" (UID: \"fa7b746a-a66d-4b14-8184-b41729626e15\") " pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.082607 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa7b746a-a66d-4b14-8184-b41729626e15-serving-cert\") pod \"route-controller-manager-6bc97b9f78-lsjlc\" (UID: \"fa7b746a-a66d-4b14-8184-b41729626e15\") " pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.092268 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvzxp\" (UniqueName: \"kubernetes.io/projected/fa7b746a-a66d-4b14-8184-b41729626e15-kube-api-access-wvzxp\") pod \"route-controller-manager-6bc97b9f78-lsjlc\" (UID: \"fa7b746a-a66d-4b14-8184-b41729626e15\") " pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.254762 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.465041 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc"] Jan 05 13:50:41 crc kubenswrapper[4740]: W0105 13:50:41.475562 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa7b746a_a66d_4b14_8184_b41729626e15.slice/crio-79670929dc8c3e7ef53ca0f85775827f7516fb0471d25b1d57055a6596b96887 WatchSource:0}: Error finding container 79670929dc8c3e7ef53ca0f85775827f7516fb0471d25b1d57055a6596b96887: Status 404 returned error can't find the container with id 79670929dc8c3e7ef53ca0f85775827f7516fb0471d25b1d57055a6596b96887 Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.724527 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" event={"ID":"fa7b746a-a66d-4b14-8184-b41729626e15","Type":"ContainerStarted","Data":"a600f0112e54ed54caf350e5c7c40ad3cb207ee32062beedcbc8413069b3ffc5"} Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.724565 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" event={"ID":"fa7b746a-a66d-4b14-8184-b41729626e15","Type":"ContainerStarted","Data":"79670929dc8c3e7ef53ca0f85775827f7516fb0471d25b1d57055a6596b96887"} Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.725526 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.728145 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-lg5bc_1759d33c-e10d-4db8-91aa-60a30ff68255/kube-multus-additional-cni-plugins/0.log" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.728206 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" event={"ID":"1759d33c-e10d-4db8-91aa-60a30ff68255","Type":"ContainerDied","Data":"393bd6d65b0a337b85cff9277b1fa9a0b6714ed5cdb7d84ec7da17e7f021db9a"} Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.728210 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lg5bc" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.728228 4740 scope.go:117] "RemoveContainer" containerID="a9a331fc5362498a453078fd241e3c653595a267f6c23bde99eb37a18e75a179" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.729546 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" event={"ID":"4f2ba57f-6524-465b-aefc-bbd0f56accdc","Type":"ContainerStarted","Data":"d85c38d15bca080d003d56a9bf69d8f7261a1dd77b9c9539ea1b9245c0c995b1"} Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.729597 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" event={"ID":"4f2ba57f-6524-465b-aefc-bbd0f56accdc","Type":"ContainerStarted","Data":"3689143ed7da526e745d06523eedf0dcf7d07802f78cb4eceddb0aae71c8423d"} Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.729831 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.731945 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.732620 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475" event={"ID":"577f43f3-8470-4de3-ab3b-9934f1deab62","Type":"ContainerDied","Data":"569e7b64f94c4a85040b23ac071175a02175e65ef0170c9ec39d1264d520cc6f"} Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.734700 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.742388 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"48381fb8-5fa8-48c4-be6d-322df35570b1","Type":"ContainerStarted","Data":"c0a18b73573eb7f7c241dff986ea6594755cf8a0f430877fe34f4066be6d85c9"} Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.742473 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"48381fb8-5fa8-48c4-be6d-322df35570b1","Type":"ContainerStarted","Data":"dad000ea5fb7be25307b4af6cc6f1787cd5ac68d4c2c65a48228e9c7e03f19f8"} Jan 05 13:50:41 crc kubenswrapper[4740]: E0105 13:50:41.742651 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-zvjqq" podUID="888ada4e-9063-44ac-b293-8de842edb998" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.744681 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" podStartSLOduration=7.744668511 podStartE2EDuration="7.744668511s" podCreationTimestamp="2026-01-05 13:50:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:50:41.74314148 +0000 UTC m=+91.050050079" watchObservedRunningTime="2026-01-05 13:50:41.744668511 +0000 UTC 
m=+91.051577090" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.752105 4740 scope.go:117] "RemoveContainer" containerID="20179e62d0bc0fe542f80a38a88a74790d63cf1372f5ac028d2d5381f64d3d85" Jan 05 13:50:41 crc kubenswrapper[4740]: E0105 13:50:41.752164 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-k8t6t" podUID="d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.860897 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" podStartSLOduration=7.860873718 podStartE2EDuration="7.860873718s" podCreationTimestamp="2026-01-05 13:50:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:50:41.856222664 +0000 UTC m=+91.163131243" watchObservedRunningTime="2026-01-05 13:50:41.860873718 +0000 UTC m=+91.167782317" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.862911 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.862901903 podStartE2EDuration="1.862901903s" podCreationTimestamp="2026-01-05 13:50:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:50:41.84119824 +0000 UTC m=+91.148106819" watchObservedRunningTime="2026-01-05 13:50:41.862901903 +0000 UTC m=+91.169810482" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.879025 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lg5bc"] Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.885359 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lg5bc"] Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.901241 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475"] Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.906493 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bz475"] Jan 05 13:50:41 crc kubenswrapper[4740]: E0105 13:50:41.915596 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod48381fb8_5fa8_48c4_be6d_322df35570b1.slice/crio-c0a18b73573eb7f7c241dff986ea6594755cf8a0f430877fe34f4066be6d85c9.scope\": RecentStats: unable to find data in memory cache]" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.979552 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:50:41 crc kubenswrapper[4740]: I0105 13:50:41.989298 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 05 13:50:42 crc kubenswrapper[4740]: I0105 13:50:42.752344 4740 generic.go:334] "Generic (PLEG): container finished" podID="48381fb8-5fa8-48c4-be6d-322df35570b1" containerID="c0a18b73573eb7f7c241dff986ea6594755cf8a0f430877fe34f4066be6d85c9" exitCode=0 Jan 05 13:50:42 crc kubenswrapper[4740]: I0105 
13:50:42.752410 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"48381fb8-5fa8-48c4-be6d-322df35570b1","Type":"ContainerDied","Data":"c0a18b73573eb7f7c241dff986ea6594755cf8a0f430877fe34f4066be6d85c9"} Jan 05 13:50:42 crc kubenswrapper[4740]: I0105 13:50:42.806459 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.806427545 podStartE2EDuration="1.806427545s" podCreationTimestamp="2026-01-05 13:50:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:50:42.804192005 +0000 UTC m=+92.111100604" watchObservedRunningTime="2026-01-05 13:50:42.806427545 +0000 UTC m=+92.113336124" Jan 05 13:50:42 crc kubenswrapper[4740]: I0105 13:50:42.979985 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1759d33c-e10d-4db8-91aa-60a30ff68255" path="/var/lib/kubelet/pods/1759d33c-e10d-4db8-91aa-60a30ff68255/volumes" Jan 05 13:50:42 crc kubenswrapper[4740]: I0105 13:50:42.982663 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="577f43f3-8470-4de3-ab3b-9934f1deab62" path="/var/lib/kubelet/pods/577f43f3-8470-4de3-ab3b-9934f1deab62/volumes" Jan 05 13:50:44 crc kubenswrapper[4740]: I0105 13:50:44.115777 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 05 13:50:44 crc kubenswrapper[4740]: I0105 13:50:44.220123 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48381fb8-5fa8-48c4-be6d-322df35570b1-kube-api-access\") pod \"48381fb8-5fa8-48c4-be6d-322df35570b1\" (UID: \"48381fb8-5fa8-48c4-be6d-322df35570b1\") " Jan 05 13:50:44 crc kubenswrapper[4740]: I0105 13:50:44.220244 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48381fb8-5fa8-48c4-be6d-322df35570b1-kubelet-dir\") pod \"48381fb8-5fa8-48c4-be6d-322df35570b1\" (UID: \"48381fb8-5fa8-48c4-be6d-322df35570b1\") " Jan 05 13:50:44 crc kubenswrapper[4740]: I0105 13:50:44.220425 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48381fb8-5fa8-48c4-be6d-322df35570b1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "48381fb8-5fa8-48c4-be6d-322df35570b1" (UID: "48381fb8-5fa8-48c4-be6d-322df35570b1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:50:44 crc kubenswrapper[4740]: I0105 13:50:44.220736 4740 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48381fb8-5fa8-48c4-be6d-322df35570b1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:44 crc kubenswrapper[4740]: I0105 13:50:44.243467 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48381fb8-5fa8-48c4-be6d-322df35570b1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "48381fb8-5fa8-48c4-be6d-322df35570b1" (UID: "48381fb8-5fa8-48c4-be6d-322df35570b1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:50:44 crc kubenswrapper[4740]: I0105 13:50:44.322057 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48381fb8-5fa8-48c4-be6d-322df35570b1-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 05 13:50:44 crc kubenswrapper[4740]: I0105 13:50:44.768519 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"48381fb8-5fa8-48c4-be6d-322df35570b1","Type":"ContainerDied","Data":"dad000ea5fb7be25307b4af6cc6f1787cd5ac68d4c2c65a48228e9c7e03f19f8"} Jan 05 13:50:44 crc kubenswrapper[4740]: I0105 13:50:44.768981 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dad000ea5fb7be25307b4af6cc6f1787cd5ac68d4c2c65a48228e9c7e03f19f8" Jan 05 13:50:44 crc kubenswrapper[4740]: I0105 13:50:44.768577 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.136157 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 05 13:50:45 crc kubenswrapper[4740]: E0105 13:50:45.136505 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48381fb8-5fa8-48c4-be6d-322df35570b1" containerName="pruner" Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.136523 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="48381fb8-5fa8-48c4-be6d-322df35570b1" containerName="pruner" Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.136686 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="48381fb8-5fa8-48c4-be6d-322df35570b1" containerName="pruner" Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.137333 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.140625 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.145941 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.146484 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.336743 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/14e93303-35a5-401b-b6d0-49931cb458eb-var-lock\") pod \"installer-9-crc\" (UID: \"14e93303-35a5-401b-b6d0-49931cb458eb\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.336810 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14e93303-35a5-401b-b6d0-49931cb458eb-kube-api-access\") pod \"installer-9-crc\" (UID: \"14e93303-35a5-401b-b6d0-49931cb458eb\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.336861 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14e93303-35a5-401b-b6d0-49931cb458eb-kubelet-dir\") pod \"installer-9-crc\" (UID: \"14e93303-35a5-401b-b6d0-49931cb458eb\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.437938 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14e93303-35a5-401b-b6d0-49931cb458eb-kubelet-dir\") pod \"installer-9-crc\" (UID: \"14e93303-35a5-401b-b6d0-49931cb458eb\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.438310 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14e93303-35a5-401b-b6d0-49931cb458eb-kube-api-access\") pod \"installer-9-crc\" (UID: \"14e93303-35a5-401b-b6d0-49931cb458eb\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.438337 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/14e93303-35a5-401b-b6d0-49931cb458eb-var-lock\") pod \"installer-9-crc\" (UID: \"14e93303-35a5-401b-b6d0-49931cb458eb\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.438416 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/14e93303-35a5-401b-b6d0-49931cb458eb-var-lock\") pod \"installer-9-crc\" (UID: \"14e93303-35a5-401b-b6d0-49931cb458eb\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.438042 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14e93303-35a5-401b-b6d0-49931cb458eb-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"14e93303-35a5-401b-b6d0-49931cb458eb\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.458699 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14e93303-35a5-401b-b6d0-49931cb458eb-kube-api-access\") pod \"installer-9-crc\" (UID: \"14e93303-35a5-401b-b6d0-49931cb458eb\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.460097 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 05 13:50:45 crc kubenswrapper[4740]: I0105 13:50:45.902111 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 05 13:50:46 crc kubenswrapper[4740]: I0105 13:50:46.089158 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 05 13:50:46 crc kubenswrapper[4740]: I0105 13:50:46.783028 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"14e93303-35a5-401b-b6d0-49931cb458eb","Type":"ContainerStarted","Data":"194a48d4c18dd856a9f530a03a2bb131e5315d60d7fb8ab84dc1f2ab32f9f5e3"} Jan 05 13:50:46 crc kubenswrapper[4740]: I0105 13:50:46.783429 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"14e93303-35a5-401b-b6d0-49931cb458eb","Type":"ContainerStarted","Data":"dd9c32b9602d80869b2d8c313691e59ced83d60a20430f5636394ea82f82676d"} Jan 05 13:50:47 crc kubenswrapper[4740]: I0105 13:50:47.791145 4740 generic.go:334] "Generic (PLEG): container finished" podID="8655819a-f2c2-4044-85e1-84f7e64cff21" containerID="1343926588d4ec9530449e43fd7a5ef7ab1b02b73bf3092d174888c296d48ace" exitCode=0 Jan 05 13:50:47 crc kubenswrapper[4740]: I0105 13:50:47.791275 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkbs6" event={"ID":"8655819a-f2c2-4044-85e1-84f7e64cff21","Type":"ContainerDied","Data":"1343926588d4ec9530449e43fd7a5ef7ab1b02b73bf3092d174888c296d48ace"} Jan 05 13:50:47 crc kubenswrapper[4740]: I0105 13:50:47.810600 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.810577883 podStartE2EDuration="2.810577883s" podCreationTimestamp="2026-01-05 13:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:50:46.80255641 +0000 UTC m=+96.109464999" watchObservedRunningTime="2026-01-05 13:50:47.810577883 +0000 UTC m=+97.117486502" Jan 05 13:50:48 crc kubenswrapper[4740]: I0105 13:50:48.799168 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkbs6" event={"ID":"8655819a-f2c2-4044-85e1-84f7e64cff21","Type":"ContainerStarted","Data":"0585f307a447b9f904aef33ae7e3c20a4886e3d7da436125939a6546cb4cddcf"} Jan 05 13:50:48 crc kubenswrapper[4740]: I0105 13:50:48.819654 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pkbs6" podStartSLOduration=2.68555106 podStartE2EDuration="49.819631484s" podCreationTimestamp="2026-01-05 13:49:59 +0000 UTC" firstStartedPulling="2026-01-05 13:50:01.162362714 +0000 UTC m=+50.469271283" lastFinishedPulling="2026-01-05 
13:50:48.296443088 +0000 UTC m=+97.603351707" observedRunningTime="2026-01-05 13:50:48.816077278 +0000 UTC m=+98.122985847" watchObservedRunningTime="2026-01-05 13:50:48.819631484 +0000 UTC m=+98.126540083" Jan 05 13:50:50 crc kubenswrapper[4740]: I0105 13:50:50.295185 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pkbs6" Jan 05 13:50:50 crc kubenswrapper[4740]: I0105 13:50:50.295256 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pkbs6" Jan 05 13:50:51 crc kubenswrapper[4740]: I0105 13:50:51.359010 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-pkbs6" podUID="8655819a-f2c2-4044-85e1-84f7e64cff21" containerName="registry-server" probeResult="failure" output=< Jan 05 13:50:51 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 13:50:51 crc kubenswrapper[4740]: > Jan 05 13:50:53 crc kubenswrapper[4740]: I0105 13:50:53.827160 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scltx" event={"ID":"2200bbba-e16a-4b74-90b2-9a037bfe0e7d","Type":"ContainerStarted","Data":"4094f42065fc96102b08317e3ce818b7ab60ae0c070edc4659d83c15c34743fa"} Jan 05 13:50:53 crc kubenswrapper[4740]: I0105 13:50:53.829830 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-br6rk" event={"ID":"c70970cb-3f87-4296-96c0-c827e79eaee6","Type":"ContainerStarted","Data":"249d83df8b127c9f8e1e755b7878bd5e4e7d72ebf8eee64ddd97091e146375cf"} Jan 05 13:50:54 crc kubenswrapper[4740]: I0105 13:50:54.842589 4740 generic.go:334] "Generic (PLEG): container finished" podID="f781df9d-cacf-433b-a588-35f67af41b66" containerID="4cb81e0177f084af7de803e89b0bf44142f04dbb87e762c882a6041783bcb9ac" exitCode=0 Jan 05 13:50:54 crc kubenswrapper[4740]: I0105 13:50:54.842676 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfzkk" event={"ID":"f781df9d-cacf-433b-a588-35f67af41b66","Type":"ContainerDied","Data":"4cb81e0177f084af7de803e89b0bf44142f04dbb87e762c882a6041783bcb9ac"} Jan 05 13:50:54 crc kubenswrapper[4740]: I0105 13:50:54.844953 4740 generic.go:334] "Generic (PLEG): container finished" podID="6a6ab9f1-e8a1-4aac-84ea-059414ff1e31" containerID="df9a7c210211634f95e70c8801542feff5c618fb0c1098733e232b91dda44146" exitCode=0 Jan 05 13:50:54 crc kubenswrapper[4740]: I0105 13:50:54.844984 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwx8f" event={"ID":"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31","Type":"ContainerDied","Data":"df9a7c210211634f95e70c8801542feff5c618fb0c1098733e232b91dda44146"} Jan 05 13:50:54 crc kubenswrapper[4740]: I0105 13:50:54.847786 4740 generic.go:334] "Generic (PLEG): container finished" podID="c70970cb-3f87-4296-96c0-c827e79eaee6" containerID="249d83df8b127c9f8e1e755b7878bd5e4e7d72ebf8eee64ddd97091e146375cf" exitCode=0 Jan 05 13:50:54 crc kubenswrapper[4740]: I0105 13:50:54.847875 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-br6rk" event={"ID":"c70970cb-3f87-4296-96c0-c827e79eaee6","Type":"ContainerDied","Data":"249d83df8b127c9f8e1e755b7878bd5e4e7d72ebf8eee64ddd97091e146375cf"} Jan 05 13:50:54 crc kubenswrapper[4740]: I0105 13:50:54.852420 4740 generic.go:334] "Generic (PLEG): container finished" podID="2200bbba-e16a-4b74-90b2-9a037bfe0e7d" 
containerID="4094f42065fc96102b08317e3ce818b7ab60ae0c070edc4659d83c15c34743fa" exitCode=0 Jan 05 13:50:54 crc kubenswrapper[4740]: I0105 13:50:54.852500 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scltx" event={"ID":"2200bbba-e16a-4b74-90b2-9a037bfe0e7d","Type":"ContainerDied","Data":"4094f42065fc96102b08317e3ce818b7ab60ae0c070edc4659d83c15c34743fa"} Jan 05 13:50:54 crc kubenswrapper[4740]: I0105 13:50:54.854409 4740 generic.go:334] "Generic (PLEG): container finished" podID="370cecdf-191e-4773-914f-eac3264601e4" containerID="d4d75d4eae3477be89ff1dc064666847cd1eca3304c1f7a7f4b5f32f1b10672b" exitCode=0 Jan 05 13:50:54 crc kubenswrapper[4740]: I0105 13:50:54.854440 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smpqb" event={"ID":"370cecdf-191e-4773-914f-eac3264601e4","Type":"ContainerDied","Data":"d4d75d4eae3477be89ff1dc064666847cd1eca3304c1f7a7f4b5f32f1b10672b"} Jan 05 13:50:55 crc kubenswrapper[4740]: I0105 13:50:55.868425 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-br6rk" event={"ID":"c70970cb-3f87-4296-96c0-c827e79eaee6","Type":"ContainerStarted","Data":"df00718016e50606530aad457f5708decb29b963c713eb5b6fdee4880d91743c"} Jan 05 13:50:55 crc kubenswrapper[4740]: I0105 13:50:55.871290 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scltx" event={"ID":"2200bbba-e16a-4b74-90b2-9a037bfe0e7d","Type":"ContainerStarted","Data":"80b2a567f7dfdf713c838fe9f1a9903af2d43798ca7360770bc43827754d966d"} Jan 05 13:50:55 crc kubenswrapper[4740]: I0105 13:50:55.873403 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smpqb" event={"ID":"370cecdf-191e-4773-914f-eac3264601e4","Type":"ContainerStarted","Data":"18d42d3a9eeb556fe47f99d315a81d4f228c4911df27e09d166f5df681fb7194"} Jan 05 13:50:55 crc kubenswrapper[4740]: I0105 13:50:55.875925 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfzkk" event={"ID":"f781df9d-cacf-433b-a588-35f67af41b66","Type":"ContainerStarted","Data":"e3a912344d218d96d418ff0e53611e078e8fdcdcd084f47f701a809492268270"} Jan 05 13:50:55 crc kubenswrapper[4740]: I0105 13:50:55.879077 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwx8f" event={"ID":"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31","Type":"ContainerStarted","Data":"13930a96314f1a3628fde07f9748dd98b0f212ac764c85009029115f0345c56f"} Jan 05 13:50:55 crc kubenswrapper[4740]: I0105 13:50:55.880843 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8t6t" event={"ID":"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2","Type":"ContainerStarted","Data":"e8ffe2cc2e736cc7321dc8730c5bff5e77567a33f36365529087bd6a93a96a80"} Jan 05 13:50:55 crc kubenswrapper[4740]: I0105 13:50:55.892226 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-br6rk" podStartSLOduration=2.758518791 podStartE2EDuration="54.892209382s" podCreationTimestamp="2026-01-05 13:50:01 +0000 UTC" firstStartedPulling="2026-01-05 13:50:03.271743782 +0000 UTC m=+52.578652361" lastFinishedPulling="2026-01-05 13:50:55.405434363 +0000 UTC m=+104.712342952" observedRunningTime="2026-01-05 13:50:55.891415741 +0000 UTC m=+105.198324340" watchObservedRunningTime="2026-01-05 13:50:55.892209382 +0000 UTC m=+105.199117961" Jan 
05 13:50:55 crc kubenswrapper[4740]: I0105 13:50:55.936122 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-smpqb" podStartSLOduration=1.909980368 podStartE2EDuration="53.93609954s" podCreationTimestamp="2026-01-05 13:50:02 +0000 UTC" firstStartedPulling="2026-01-05 13:50:03.268669743 +0000 UTC m=+52.575578322" lastFinishedPulling="2026-01-05 13:50:55.294788885 +0000 UTC m=+104.601697494" observedRunningTime="2026-01-05 13:50:55.918552179 +0000 UTC m=+105.225460758" watchObservedRunningTime="2026-01-05 13:50:55.93609954 +0000 UTC m=+105.243008109" Jan 05 13:50:55 crc kubenswrapper[4740]: I0105 13:50:55.939544 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zwx8f" podStartSLOduration=2.921071062 podStartE2EDuration="52.939538542s" podCreationTimestamp="2026-01-05 13:50:03 +0000 UTC" firstStartedPulling="2026-01-05 13:50:05.324292052 +0000 UTC m=+54.631200621" lastFinishedPulling="2026-01-05 13:50:55.342759482 +0000 UTC m=+104.649668101" observedRunningTime="2026-01-05 13:50:55.931917207 +0000 UTC m=+105.238825786" watchObservedRunningTime="2026-01-05 13:50:55.939538542 +0000 UTC m=+105.246447121" Jan 05 13:50:55 crc kubenswrapper[4740]: I0105 13:50:55.972764 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hfzkk" podStartSLOduration=1.6759445880000001 podStartE2EDuration="55.972740732s" podCreationTimestamp="2026-01-05 13:50:00 +0000 UTC" firstStartedPulling="2026-01-05 13:50:01.16681283 +0000 UTC m=+50.473721439" lastFinishedPulling="2026-01-05 13:50:55.463608994 +0000 UTC m=+104.770517583" observedRunningTime="2026-01-05 13:50:55.970169653 +0000 UTC m=+105.277078232" watchObservedRunningTime="2026-01-05 13:50:55.972740732 +0000 UTC m=+105.279649321" Jan 05 13:50:55 crc kubenswrapper[4740]: I0105 13:50:55.993716 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-scltx" podStartSLOduration=2.676205423 podStartE2EDuration="53.993699464s" podCreationTimestamp="2026-01-05 13:50:02 +0000 UTC" firstStartedPulling="2026-01-05 13:50:04.291672628 +0000 UTC m=+53.598581207" lastFinishedPulling="2026-01-05 13:50:55.609166669 +0000 UTC m=+104.916075248" observedRunningTime="2026-01-05 13:50:55.989445971 +0000 UTC m=+105.296354560" watchObservedRunningTime="2026-01-05 13:50:55.993699464 +0000 UTC m=+105.300608043" Jan 05 13:50:56 crc kubenswrapper[4740]: I0105 13:50:56.887309 4740 generic.go:334] "Generic (PLEG): container finished" podID="d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2" containerID="e8ffe2cc2e736cc7321dc8730c5bff5e77567a33f36365529087bd6a93a96a80" exitCode=0 Jan 05 13:50:56 crc kubenswrapper[4740]: I0105 13:50:56.887355 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8t6t" event={"ID":"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2","Type":"ContainerDied","Data":"e8ffe2cc2e736cc7321dc8730c5bff5e77567a33f36365529087bd6a93a96a80"} Jan 05 13:50:57 crc kubenswrapper[4740]: I0105 13:50:57.908186 4740 generic.go:334] "Generic (PLEG): container finished" podID="888ada4e-9063-44ac-b293-8de842edb998" containerID="623588afa9b68b9df907a942cd8dafb5c6b24a477f1eb34b85edd4371cd39fa5" exitCode=0 Jan 05 13:50:57 crc kubenswrapper[4740]: I0105 13:50:57.908316 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvjqq" 
event={"ID":"888ada4e-9063-44ac-b293-8de842edb998","Type":"ContainerDied","Data":"623588afa9b68b9df907a942cd8dafb5c6b24a477f1eb34b85edd4371cd39fa5"} Jan 05 13:50:57 crc kubenswrapper[4740]: I0105 13:50:57.912410 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8t6t" event={"ID":"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2","Type":"ContainerStarted","Data":"693a1594e8ad2aed0edca20c218625b41b6313f4fe8a1d011abc15898bb6a768"} Jan 05 13:50:57 crc kubenswrapper[4740]: I0105 13:50:57.940044 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k8t6t" podStartSLOduration=2.36480715 podStartE2EDuration="58.940022999s" podCreationTimestamp="2026-01-05 13:49:59 +0000 UTC" firstStartedPulling="2026-01-05 13:50:01.192528209 +0000 UTC m=+50.499436788" lastFinishedPulling="2026-01-05 13:50:57.767744058 +0000 UTC m=+107.074652637" observedRunningTime="2026-01-05 13:50:57.938274003 +0000 UTC m=+107.245182592" watchObservedRunningTime="2026-01-05 13:50:57.940022999 +0000 UTC m=+107.246931588" Jan 05 13:50:58 crc kubenswrapper[4740]: I0105 13:50:58.918750 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvjqq" event={"ID":"888ada4e-9063-44ac-b293-8de842edb998","Type":"ContainerStarted","Data":"bc968c1f38b38fe7b36bc49197503cc91cb4e12722792903a3a581b21307bc17"} Jan 05 13:51:00 crc kubenswrapper[4740]: I0105 13:51:00.100891 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k8t6t" Jan 05 13:51:00 crc kubenswrapper[4740]: I0105 13:51:00.101125 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k8t6t" Jan 05 13:51:00 crc kubenswrapper[4740]: I0105 13:51:00.290683 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k8t6t" Jan 05 13:51:00 crc kubenswrapper[4740]: I0105 13:51:00.312946 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zvjqq" podStartSLOduration=2.773170393 podStartE2EDuration="1m0.312928428s" podCreationTimestamp="2026-01-05 13:50:00 +0000 UTC" firstStartedPulling="2026-01-05 13:50:01.182263143 +0000 UTC m=+50.489171732" lastFinishedPulling="2026-01-05 13:50:58.722021148 +0000 UTC m=+108.028929767" observedRunningTime="2026-01-05 13:50:59.94584859 +0000 UTC m=+109.252757179" watchObservedRunningTime="2026-01-05 13:51:00.312928428 +0000 UTC m=+109.619837007" Jan 05 13:51:00 crc kubenswrapper[4740]: I0105 13:51:00.341793 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pkbs6" Jan 05 13:51:00 crc kubenswrapper[4740]: I0105 13:51:00.394460 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pkbs6" Jan 05 13:51:00 crc kubenswrapper[4740]: I0105 13:51:00.508422 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zvjqq" Jan 05 13:51:00 crc kubenswrapper[4740]: I0105 13:51:00.508493 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zvjqq" Jan 05 13:51:00 crc kubenswrapper[4740]: I0105 13:51:00.766341 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-hfzkk" Jan 05 13:51:00 crc kubenswrapper[4740]: I0105 13:51:00.767054 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hfzkk" Jan 05 13:51:00 crc kubenswrapper[4740]: I0105 13:51:00.819002 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hfzkk" Jan 05 13:51:00 crc kubenswrapper[4740]: I0105 13:51:00.984324 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hfzkk" Jan 05 13:51:01 crc kubenswrapper[4740]: I0105 13:51:01.543101 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-zvjqq" podUID="888ada4e-9063-44ac-b293-8de842edb998" containerName="registry-server" probeResult="failure" output=< Jan 05 13:51:01 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 13:51:01 crc kubenswrapper[4740]: > Jan 05 13:51:02 crc kubenswrapper[4740]: I0105 13:51:02.103826 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-br6rk" Jan 05 13:51:02 crc kubenswrapper[4740]: I0105 13:51:02.103907 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-br6rk" Jan 05 13:51:02 crc kubenswrapper[4740]: I0105 13:51:02.170152 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-br6rk" Jan 05 13:51:02 crc kubenswrapper[4740]: I0105 13:51:02.507604 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-smpqb" Jan 05 13:51:02 crc kubenswrapper[4740]: I0105 13:51:02.507677 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-smpqb" Jan 05 13:51:02 crc kubenswrapper[4740]: I0105 13:51:02.580790 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-smpqb" Jan 05 13:51:02 crc kubenswrapper[4740]: I0105 13:51:02.993050 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-br6rk" Jan 05 13:51:03 crc kubenswrapper[4740]: I0105 13:51:03.007674 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-smpqb" Jan 05 13:51:03 crc kubenswrapper[4740]: I0105 13:51:03.359641 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-scltx" Jan 05 13:51:03 crc kubenswrapper[4740]: I0105 13:51:03.359690 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-scltx" Jan 05 13:51:03 crc kubenswrapper[4740]: I0105 13:51:03.400782 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hfzkk"] Jan 05 13:51:03 crc kubenswrapper[4740]: I0105 13:51:03.420579 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-scltx" Jan 05 13:51:03 crc kubenswrapper[4740]: I0105 13:51:03.761891 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zwx8f" Jan 05 13:51:03 crc kubenswrapper[4740]: I0105 13:51:03.761972 4740 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zwx8f" Jan 05 13:51:03 crc kubenswrapper[4740]: I0105 13:51:03.820180 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zwx8f" Jan 05 13:51:03 crc kubenswrapper[4740]: I0105 13:51:03.949234 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hfzkk" podUID="f781df9d-cacf-433b-a588-35f67af41b66" containerName="registry-server" containerID="cri-o://e3a912344d218d96d418ff0e53611e078e8fdcdcd084f47f701a809492268270" gracePeriod=2 Jan 05 13:51:04 crc kubenswrapper[4740]: I0105 13:51:04.010107 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zwx8f" Jan 05 13:51:04 crc kubenswrapper[4740]: I0105 13:51:04.012144 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-scltx" Jan 05 13:51:04 crc kubenswrapper[4740]: I0105 13:51:04.958547 4740 generic.go:334] "Generic (PLEG): container finished" podID="f781df9d-cacf-433b-a588-35f67af41b66" containerID="e3a912344d218d96d418ff0e53611e078e8fdcdcd084f47f701a809492268270" exitCode=0 Jan 05 13:51:04 crc kubenswrapper[4740]: I0105 13:51:04.958711 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfzkk" event={"ID":"f781df9d-cacf-433b-a588-35f67af41b66","Type":"ContainerDied","Data":"e3a912344d218d96d418ff0e53611e078e8fdcdcd084f47f701a809492268270"} Jan 05 13:51:05 crc kubenswrapper[4740]: I0105 13:51:05.198766 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-smpqb"] Jan 05 13:51:05 crc kubenswrapper[4740]: I0105 13:51:05.199048 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-smpqb" podUID="370cecdf-191e-4773-914f-eac3264601e4" containerName="registry-server" containerID="cri-o://18d42d3a9eeb556fe47f99d315a81d4f228c4911df27e09d166f5df681fb7194" gracePeriod=2 Jan 05 13:51:05 crc kubenswrapper[4740]: I0105 13:51:05.670697 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hfzkk" Jan 05 13:51:05 crc kubenswrapper[4740]: I0105 13:51:05.812038 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l8db\" (UniqueName: \"kubernetes.io/projected/f781df9d-cacf-433b-a588-35f67af41b66-kube-api-access-2l8db\") pod \"f781df9d-cacf-433b-a588-35f67af41b66\" (UID: \"f781df9d-cacf-433b-a588-35f67af41b66\") " Jan 05 13:51:05 crc kubenswrapper[4740]: I0105 13:51:05.812176 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f781df9d-cacf-433b-a588-35f67af41b66-catalog-content\") pod \"f781df9d-cacf-433b-a588-35f67af41b66\" (UID: \"f781df9d-cacf-433b-a588-35f67af41b66\") " Jan 05 13:51:05 crc kubenswrapper[4740]: I0105 13:51:05.812213 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f781df9d-cacf-433b-a588-35f67af41b66-utilities\") pod \"f781df9d-cacf-433b-a588-35f67af41b66\" (UID: \"f781df9d-cacf-433b-a588-35f67af41b66\") " Jan 05 13:51:05 crc kubenswrapper[4740]: I0105 13:51:05.813355 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f781df9d-cacf-433b-a588-35f67af41b66-utilities" (OuterVolumeSpecName: "utilities") pod "f781df9d-cacf-433b-a588-35f67af41b66" (UID: "f781df9d-cacf-433b-a588-35f67af41b66"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:51:05 crc kubenswrapper[4740]: I0105 13:51:05.818017 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f781df9d-cacf-433b-a588-35f67af41b66-kube-api-access-2l8db" (OuterVolumeSpecName: "kube-api-access-2l8db") pod "f781df9d-cacf-433b-a588-35f67af41b66" (UID: "f781df9d-cacf-433b-a588-35f67af41b66"). InnerVolumeSpecName "kube-api-access-2l8db". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:51:05 crc kubenswrapper[4740]: I0105 13:51:05.913950 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l8db\" (UniqueName: \"kubernetes.io/projected/f781df9d-cacf-433b-a588-35f67af41b66-kube-api-access-2l8db\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:05 crc kubenswrapper[4740]: I0105 13:51:05.913975 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f781df9d-cacf-433b-a588-35f67af41b66-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:05 crc kubenswrapper[4740]: I0105 13:51:05.967023 4740 generic.go:334] "Generic (PLEG): container finished" podID="370cecdf-191e-4773-914f-eac3264601e4" containerID="18d42d3a9eeb556fe47f99d315a81d4f228c4911df27e09d166f5df681fb7194" exitCode=0 Jan 05 13:51:05 crc kubenswrapper[4740]: I0105 13:51:05.967058 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smpqb" event={"ID":"370cecdf-191e-4773-914f-eac3264601e4","Type":"ContainerDied","Data":"18d42d3a9eeb556fe47f99d315a81d4f228c4911df27e09d166f5df681fb7194"} Jan 05 13:51:05 crc kubenswrapper[4740]: I0105 13:51:05.969033 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfzkk" event={"ID":"f781df9d-cacf-433b-a588-35f67af41b66","Type":"ContainerDied","Data":"8701e5af45eb2ab53fc58cce5fbb314c972dff346f2be237dfa938856f3949aa"} Jan 05 13:51:05 crc kubenswrapper[4740]: I0105 13:51:05.969087 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hfzkk" Jan 05 13:51:05 crc kubenswrapper[4740]: I0105 13:51:05.969105 4740 scope.go:117] "RemoveContainer" containerID="e3a912344d218d96d418ff0e53611e078e8fdcdcd084f47f701a809492268270" Jan 05 13:51:05 crc kubenswrapper[4740]: I0105 13:51:05.991962 4740 scope.go:117] "RemoveContainer" containerID="4cb81e0177f084af7de803e89b0bf44142f04dbb87e762c882a6041783bcb9ac" Jan 05 13:51:06 crc kubenswrapper[4740]: I0105 13:51:06.014899 4740 scope.go:117] "RemoveContainer" containerID="5c5079391f83cf46ba0ddce6c18ddce4a9d5076d15b7e1cbe564018cafec9e62" Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.092603 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-smpqb" Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.231786 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/370cecdf-191e-4773-914f-eac3264601e4-catalog-content\") pod \"370cecdf-191e-4773-914f-eac3264601e4\" (UID: \"370cecdf-191e-4773-914f-eac3264601e4\") " Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.232141 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq8t8\" (UniqueName: \"kubernetes.io/projected/370cecdf-191e-4773-914f-eac3264601e4-kube-api-access-cq8t8\") pod \"370cecdf-191e-4773-914f-eac3264601e4\" (UID: \"370cecdf-191e-4773-914f-eac3264601e4\") " Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.232270 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/370cecdf-191e-4773-914f-eac3264601e4-utilities\") pod \"370cecdf-191e-4773-914f-eac3264601e4\" (UID: \"370cecdf-191e-4773-914f-eac3264601e4\") " Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.232968 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/370cecdf-191e-4773-914f-eac3264601e4-utilities" (OuterVolumeSpecName: "utilities") pod "370cecdf-191e-4773-914f-eac3264601e4" (UID: "370cecdf-191e-4773-914f-eac3264601e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.236759 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/370cecdf-191e-4773-914f-eac3264601e4-kube-api-access-cq8t8" (OuterVolumeSpecName: "kube-api-access-cq8t8") pod "370cecdf-191e-4773-914f-eac3264601e4" (UID: "370cecdf-191e-4773-914f-eac3264601e4"). InnerVolumeSpecName "kube-api-access-cq8t8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.273853 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/370cecdf-191e-4773-914f-eac3264601e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "370cecdf-191e-4773-914f-eac3264601e4" (UID: "370cecdf-191e-4773-914f-eac3264601e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.276134 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f781df9d-cacf-433b-a588-35f67af41b66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f781df9d-cacf-433b-a588-35f67af41b66" (UID: "f781df9d-cacf-433b-a588-35f67af41b66"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.334359 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/370cecdf-191e-4773-914f-eac3264601e4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.334399 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f781df9d-cacf-433b-a588-35f67af41b66-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.334412 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq8t8\" (UniqueName: \"kubernetes.io/projected/370cecdf-191e-4773-914f-eac3264601e4-kube-api-access-cq8t8\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.334426 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/370cecdf-191e-4773-914f-eac3264601e4-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.509151 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hfzkk"] Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.512593 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hfzkk"] Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.605610 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zwx8f"] Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.605965 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zwx8f" podUID="6a6ab9f1-e8a1-4aac-84ea-059414ff1e31" containerName="registry-server" containerID="cri-o://13930a96314f1a3628fde07f9748dd98b0f212ac764c85009029115f0345c56f" gracePeriod=2 Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.988771 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-smpqb" event={"ID":"370cecdf-191e-4773-914f-eac3264601e4","Type":"ContainerDied","Data":"10c26ef3179558ac98a2671e5ca614258c10f86e6b117b91d3eb72a0227e6821"} Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.988870 4740 scope.go:117] "RemoveContainer" containerID="18d42d3a9eeb556fe47f99d315a81d4f228c4911df27e09d166f5df681fb7194" Jan 05 13:51:07 crc kubenswrapper[4740]: I0105 13:51:07.988874 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-smpqb" Jan 05 13:51:08 crc kubenswrapper[4740]: I0105 13:51:08.008909 4740 scope.go:117] "RemoveContainer" containerID="d4d75d4eae3477be89ff1dc064666847cd1eca3304c1f7a7f4b5f32f1b10672b" Jan 05 13:51:08 crc kubenswrapper[4740]: I0105 13:51:08.031734 4740 scope.go:117] "RemoveContainer" containerID="2831345f81928149114d5110dcaf5dbeb2040ffc393e4c5ce74075ea4def5480" Jan 05 13:51:08 crc kubenswrapper[4740]: I0105 13:51:08.043928 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-smpqb"] Jan 05 13:51:08 crc kubenswrapper[4740]: I0105 13:51:08.049033 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-smpqb"] Jan 05 13:51:08 crc kubenswrapper[4740]: I0105 13:51:08.977752 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="370cecdf-191e-4773-914f-eac3264601e4" path="/var/lib/kubelet/pods/370cecdf-191e-4773-914f-eac3264601e4/volumes" Jan 05 13:51:08 crc kubenswrapper[4740]: I0105 13:51:08.979487 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f781df9d-cacf-433b-a588-35f67af41b66" path="/var/lib/kubelet/pods/f781df9d-cacf-433b-a588-35f67af41b66/volumes" Jan 05 13:51:10 crc kubenswrapper[4740]: I0105 13:51:10.003292 4740 generic.go:334] "Generic (PLEG): container finished" podID="6a6ab9f1-e8a1-4aac-84ea-059414ff1e31" containerID="13930a96314f1a3628fde07f9748dd98b0f212ac764c85009029115f0345c56f" exitCode=0 Jan 05 13:51:10 crc kubenswrapper[4740]: I0105 13:51:10.003438 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwx8f" event={"ID":"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31","Type":"ContainerDied","Data":"13930a96314f1a3628fde07f9748dd98b0f212ac764c85009029115f0345c56f"} Jan 05 13:51:10 crc kubenswrapper[4740]: I0105 13:51:10.157823 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k8t6t" Jan 05 13:51:10 crc kubenswrapper[4740]: I0105 13:51:10.175855 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zwx8f" Jan 05 13:51:10 crc kubenswrapper[4740]: I0105 13:51:10.271812 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-catalog-content\") pod \"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31\" (UID: \"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31\") " Jan 05 13:51:10 crc kubenswrapper[4740]: I0105 13:51:10.271883 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-utilities\") pod \"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31\" (UID: \"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31\") " Jan 05 13:51:10 crc kubenswrapper[4740]: I0105 13:51:10.271911 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf4rb\" (UniqueName: \"kubernetes.io/projected/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-kube-api-access-nf4rb\") pod \"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31\" (UID: \"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31\") " Jan 05 13:51:10 crc kubenswrapper[4740]: I0105 13:51:10.272749 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-utilities" (OuterVolumeSpecName: "utilities") pod "6a6ab9f1-e8a1-4aac-84ea-059414ff1e31" (UID: "6a6ab9f1-e8a1-4aac-84ea-059414ff1e31"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:51:10 crc kubenswrapper[4740]: I0105 13:51:10.276668 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-kube-api-access-nf4rb" (OuterVolumeSpecName: "kube-api-access-nf4rb") pod "6a6ab9f1-e8a1-4aac-84ea-059414ff1e31" (UID: "6a6ab9f1-e8a1-4aac-84ea-059414ff1e31"). InnerVolumeSpecName "kube-api-access-nf4rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:51:10 crc kubenswrapper[4740]: I0105 13:51:10.373604 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:10 crc kubenswrapper[4740]: I0105 13:51:10.373631 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf4rb\" (UniqueName: \"kubernetes.io/projected/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-kube-api-access-nf4rb\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:10 crc kubenswrapper[4740]: I0105 13:51:10.385893 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a6ab9f1-e8a1-4aac-84ea-059414ff1e31" (UID: "6a6ab9f1-e8a1-4aac-84ea-059414ff1e31"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:51:10 crc kubenswrapper[4740]: I0105 13:51:10.475466 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:10 crc kubenswrapper[4740]: I0105 13:51:10.551174 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zvjqq" Jan 05 13:51:10 crc kubenswrapper[4740]: I0105 13:51:10.597205 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zvjqq" Jan 05 13:51:11 crc kubenswrapper[4740]: I0105 13:51:11.018287 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwx8f" Jan 05 13:51:11 crc kubenswrapper[4740]: I0105 13:51:11.018293 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwx8f" event={"ID":"6a6ab9f1-e8a1-4aac-84ea-059414ff1e31","Type":"ContainerDied","Data":"0e7fa24def0d99b2bba2cc3a47e74b93ab067b924f07823a98e5e26670fb0800"} Jan 05 13:51:11 crc kubenswrapper[4740]: I0105 13:51:11.018908 4740 scope.go:117] "RemoveContainer" containerID="13930a96314f1a3628fde07f9748dd98b0f212ac764c85009029115f0345c56f" Jan 05 13:51:11 crc kubenswrapper[4740]: I0105 13:51:11.042015 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zwx8f"] Jan 05 13:51:11 crc kubenswrapper[4740]: I0105 13:51:11.047896 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zwx8f"] Jan 05 13:51:11 crc kubenswrapper[4740]: I0105 13:51:11.053525 4740 scope.go:117] "RemoveContainer" containerID="df9a7c210211634f95e70c8801542feff5c618fb0c1098733e232b91dda44146" Jan 05 13:51:11 crc kubenswrapper[4740]: I0105 13:51:11.081858 4740 scope.go:117] "RemoveContainer" containerID="1555cca65e0ba43b9ec762bb13cf2a5879339f5a1aaaec3f9c78fe3c10bb2319" Jan 05 13:51:12 crc kubenswrapper[4740]: I0105 13:51:12.976089 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a6ab9f1-e8a1-4aac-84ea-059414ff1e31" path="/var/lib/kubelet/pods/6a6ab9f1-e8a1-4aac-84ea-059414ff1e31/volumes" Jan 05 13:51:14 crc kubenswrapper[4740]: I0105 13:51:14.005551 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zvjqq"] Jan 05 13:51:14 crc kubenswrapper[4740]: I0105 13:51:14.006750 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zvjqq" podUID="888ada4e-9063-44ac-b293-8de842edb998" containerName="registry-server" containerID="cri-o://bc968c1f38b38fe7b36bc49197503cc91cb4e12722792903a3a581b21307bc17" gracePeriod=2 Jan 05 13:51:14 crc kubenswrapper[4740]: I0105 13:51:14.505803 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-69f74844c8-4rq4j"] Jan 05 13:51:14 crc kubenswrapper[4740]: I0105 13:51:14.506394 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" podUID="4f2ba57f-6524-465b-aefc-bbd0f56accdc" containerName="controller-manager" containerID="cri-o://d85c38d15bca080d003d56a9bf69d8f7261a1dd77b9c9539ea1b9245c0c995b1" gracePeriod=30 Jan 05 13:51:14 crc kubenswrapper[4740]: I0105 13:51:14.603949 4740 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc"] Jan 05 13:51:14 crc kubenswrapper[4740]: I0105 13:51:14.604158 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" podUID="fa7b746a-a66d-4b14-8184-b41729626e15" containerName="route-controller-manager" containerID="cri-o://a600f0112e54ed54caf350e5c7c40ad3cb207ee32062beedcbc8413069b3ffc5" gracePeriod=30 Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.040700 4740 generic.go:334] "Generic (PLEG): container finished" podID="888ada4e-9063-44ac-b293-8de842edb998" containerID="bc968c1f38b38fe7b36bc49197503cc91cb4e12722792903a3a581b21307bc17" exitCode=0 Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.040752 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvjqq" event={"ID":"888ada4e-9063-44ac-b293-8de842edb998","Type":"ContainerDied","Data":"bc968c1f38b38fe7b36bc49197503cc91cb4e12722792903a3a581b21307bc17"} Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.041772 4740 generic.go:334] "Generic (PLEG): container finished" podID="fa7b746a-a66d-4b14-8184-b41729626e15" containerID="a600f0112e54ed54caf350e5c7c40ad3cb207ee32062beedcbc8413069b3ffc5" exitCode=0 Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.041811 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" event={"ID":"fa7b746a-a66d-4b14-8184-b41729626e15","Type":"ContainerDied","Data":"a600f0112e54ed54caf350e5c7c40ad3cb207ee32062beedcbc8413069b3ffc5"} Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.043185 4740 generic.go:334] "Generic (PLEG): container finished" podID="4f2ba57f-6524-465b-aefc-bbd0f56accdc" containerID="d85c38d15bca080d003d56a9bf69d8f7261a1dd77b9c9539ea1b9245c0c995b1" exitCode=0 Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.043227 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" event={"ID":"4f2ba57f-6524-465b-aefc-bbd0f56accdc","Type":"ContainerDied","Data":"d85c38d15bca080d003d56a9bf69d8f7261a1dd77b9c9539ea1b9245c0c995b1"} Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.455989 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.524635 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.535743 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zvjqq" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.639159 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvzxp\" (UniqueName: \"kubernetes.io/projected/fa7b746a-a66d-4b14-8184-b41729626e15-kube-api-access-wvzxp\") pod \"fa7b746a-a66d-4b14-8184-b41729626e15\" (UID: \"fa7b746a-a66d-4b14-8184-b41729626e15\") " Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.639231 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888ada4e-9063-44ac-b293-8de842edb998-utilities\") pod \"888ada4e-9063-44ac-b293-8de842edb998\" (UID: \"888ada4e-9063-44ac-b293-8de842edb998\") " Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.639279 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f2ba57f-6524-465b-aefc-bbd0f56accdc-serving-cert\") pod \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.639304 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r49xt\" (UniqueName: \"kubernetes.io/projected/4f2ba57f-6524-465b-aefc-bbd0f56accdc-kube-api-access-r49xt\") pod \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.639330 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-config\") pod \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.639356 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-client-ca\") pod \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.639382 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888ada4e-9063-44ac-b293-8de842edb998-catalog-content\") pod \"888ada4e-9063-44ac-b293-8de842edb998\" (UID: \"888ada4e-9063-44ac-b293-8de842edb998\") " Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.639414 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-proxy-ca-bundles\") pod \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\" (UID: \"4f2ba57f-6524-465b-aefc-bbd0f56accdc\") " Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.639473 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa7b746a-a66d-4b14-8184-b41729626e15-config\") pod \"fa7b746a-a66d-4b14-8184-b41729626e15\" (UID: \"fa7b746a-a66d-4b14-8184-b41729626e15\") " Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.639510 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa7b746a-a66d-4b14-8184-b41729626e15-serving-cert\") pod \"fa7b746a-a66d-4b14-8184-b41729626e15\" (UID: 
\"fa7b746a-a66d-4b14-8184-b41729626e15\") " Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.639532 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa7b746a-a66d-4b14-8184-b41729626e15-client-ca\") pod \"fa7b746a-a66d-4b14-8184-b41729626e15\" (UID: \"fa7b746a-a66d-4b14-8184-b41729626e15\") " Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.639563 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcdnr\" (UniqueName: \"kubernetes.io/projected/888ada4e-9063-44ac-b293-8de842edb998-kube-api-access-pcdnr\") pod \"888ada4e-9063-44ac-b293-8de842edb998\" (UID: \"888ada4e-9063-44ac-b293-8de842edb998\") " Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.640302 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/888ada4e-9063-44ac-b293-8de842edb998-utilities" (OuterVolumeSpecName: "utilities") pod "888ada4e-9063-44ac-b293-8de842edb998" (UID: "888ada4e-9063-44ac-b293-8de842edb998"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.640523 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa7b746a-a66d-4b14-8184-b41729626e15-config" (OuterVolumeSpecName: "config") pod "fa7b746a-a66d-4b14-8184-b41729626e15" (UID: "fa7b746a-a66d-4b14-8184-b41729626e15"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.640723 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4f2ba57f-6524-465b-aefc-bbd0f56accdc" (UID: "4f2ba57f-6524-465b-aefc-bbd0f56accdc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.641101 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-config" (OuterVolumeSpecName: "config") pod "4f2ba57f-6524-465b-aefc-bbd0f56accdc" (UID: "4f2ba57f-6524-465b-aefc-bbd0f56accdc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.641330 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa7b746a-a66d-4b14-8184-b41729626e15-client-ca" (OuterVolumeSpecName: "client-ca") pod "fa7b746a-a66d-4b14-8184-b41729626e15" (UID: "fa7b746a-a66d-4b14-8184-b41729626e15"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.641536 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-client-ca" (OuterVolumeSpecName: "client-ca") pod "4f2ba57f-6524-465b-aefc-bbd0f56accdc" (UID: "4f2ba57f-6524-465b-aefc-bbd0f56accdc"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.644264 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f2ba57f-6524-465b-aefc-bbd0f56accdc-kube-api-access-r49xt" (OuterVolumeSpecName: "kube-api-access-r49xt") pod "4f2ba57f-6524-465b-aefc-bbd0f56accdc" (UID: "4f2ba57f-6524-465b-aefc-bbd0f56accdc"). InnerVolumeSpecName "kube-api-access-r49xt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.644316 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/888ada4e-9063-44ac-b293-8de842edb998-kube-api-access-pcdnr" (OuterVolumeSpecName: "kube-api-access-pcdnr") pod "888ada4e-9063-44ac-b293-8de842edb998" (UID: "888ada4e-9063-44ac-b293-8de842edb998"). InnerVolumeSpecName "kube-api-access-pcdnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.644337 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2ba57f-6524-465b-aefc-bbd0f56accdc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4f2ba57f-6524-465b-aefc-bbd0f56accdc" (UID: "4f2ba57f-6524-465b-aefc-bbd0f56accdc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.644719 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa7b746a-a66d-4b14-8184-b41729626e15-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fa7b746a-a66d-4b14-8184-b41729626e15" (UID: "fa7b746a-a66d-4b14-8184-b41729626e15"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.650763 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa7b746a-a66d-4b14-8184-b41729626e15-kube-api-access-wvzxp" (OuterVolumeSpecName: "kube-api-access-wvzxp") pod "fa7b746a-a66d-4b14-8184-b41729626e15" (UID: "fa7b746a-a66d-4b14-8184-b41729626e15"). InnerVolumeSpecName "kube-api-access-wvzxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.687301 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/888ada4e-9063-44ac-b293-8de842edb998-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "888ada4e-9063-44ac-b293-8de842edb998" (UID: "888ada4e-9063-44ac-b293-8de842edb998"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.740754 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa7b746a-a66d-4b14-8184-b41729626e15-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.740785 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa7b746a-a66d-4b14-8184-b41729626e15-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.740795 4740 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa7b746a-a66d-4b14-8184-b41729626e15-client-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.740804 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcdnr\" (UniqueName: \"kubernetes.io/projected/888ada4e-9063-44ac-b293-8de842edb998-kube-api-access-pcdnr\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.740816 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvzxp\" (UniqueName: \"kubernetes.io/projected/fa7b746a-a66d-4b14-8184-b41729626e15-kube-api-access-wvzxp\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.740824 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888ada4e-9063-44ac-b293-8de842edb998-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.740831 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f2ba57f-6524-465b-aefc-bbd0f56accdc-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.740839 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r49xt\" (UniqueName: \"kubernetes.io/projected/4f2ba57f-6524-465b-aefc-bbd0f56accdc-kube-api-access-r49xt\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.740847 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.740856 4740 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-client-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.740863 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888ada4e-9063-44ac-b293-8de842edb998-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:15 crc kubenswrapper[4740]: I0105 13:51:15.740872 4740 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f2ba57f-6524-465b-aefc-bbd0f56accdc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.050027 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" 
event={"ID":"4f2ba57f-6524-465b-aefc-bbd0f56accdc","Type":"ContainerDied","Data":"3689143ed7da526e745d06523eedf0dcf7d07802f78cb4eceddb0aae71c8423d"} Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.050054 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69f74844c8-4rq4j" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.050448 4740 scope.go:117] "RemoveContainer" containerID="d85c38d15bca080d003d56a9bf69d8f7261a1dd77b9c9539ea1b9245c0c995b1" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.052740 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvjqq" event={"ID":"888ada4e-9063-44ac-b293-8de842edb998","Type":"ContainerDied","Data":"efc8661ab01d118652dbf546c4dcb0ff87250da3c1fb90ff4438c03d4ca6e65d"} Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.052751 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zvjqq" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.054132 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" event={"ID":"fa7b746a-a66d-4b14-8184-b41729626e15","Type":"ContainerDied","Data":"79670929dc8c3e7ef53ca0f85775827f7516fb0471d25b1d57055a6596b96887"} Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.054216 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.067187 4740 scope.go:117] "RemoveContainer" containerID="bc968c1f38b38fe7b36bc49197503cc91cb4e12722792903a3a581b21307bc17" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.086398 4740 scope.go:117] "RemoveContainer" containerID="623588afa9b68b9df907a942cd8dafb5c6b24a477f1eb34b85edd4371cd39fa5" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.106286 4740 scope.go:117] "RemoveContainer" containerID="b631471044aee78f3aff65f38298ad87bfc01cb238f28d24a2e6aad93e63e294" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.114557 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zvjqq"] Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.118583 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zvjqq"] Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.127751 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-69f74844c8-4rq4j"] Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.127889 4740 scope.go:117] "RemoveContainer" containerID="a600f0112e54ed54caf350e5c7c40ad3cb207ee32062beedcbc8413069b3ffc5" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.131756 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-69f74844c8-4rq4j"] Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.142502 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc"] Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.145220 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bc97b9f78-lsjlc"] Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389220 
4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-547b88d74f-q6php"] Jan 05 13:51:16 crc kubenswrapper[4740]: E0105 13:51:16.389465 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f781df9d-cacf-433b-a588-35f67af41b66" containerName="extract-content" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389477 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f781df9d-cacf-433b-a588-35f67af41b66" containerName="extract-content" Jan 05 13:51:16 crc kubenswrapper[4740]: E0105 13:51:16.389487 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6ab9f1-e8a1-4aac-84ea-059414ff1e31" containerName="extract-content" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389492 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6ab9f1-e8a1-4aac-84ea-059414ff1e31" containerName="extract-content" Jan 05 13:51:16 crc kubenswrapper[4740]: E0105 13:51:16.389502 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f781df9d-cacf-433b-a588-35f67af41b66" containerName="registry-server" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389507 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f781df9d-cacf-433b-a588-35f67af41b66" containerName="registry-server" Jan 05 13:51:16 crc kubenswrapper[4740]: E0105 13:51:16.389515 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f781df9d-cacf-433b-a588-35f67af41b66" containerName="extract-utilities" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389521 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f781df9d-cacf-433b-a588-35f67af41b66" containerName="extract-utilities" Jan 05 13:51:16 crc kubenswrapper[4740]: E0105 13:51:16.389531 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370cecdf-191e-4773-914f-eac3264601e4" containerName="extract-content" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389538 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="370cecdf-191e-4773-914f-eac3264601e4" containerName="extract-content" Jan 05 13:51:16 crc kubenswrapper[4740]: E0105 13:51:16.389546 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370cecdf-191e-4773-914f-eac3264601e4" containerName="registry-server" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389553 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="370cecdf-191e-4773-914f-eac3264601e4" containerName="registry-server" Jan 05 13:51:16 crc kubenswrapper[4740]: E0105 13:51:16.389562 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f2ba57f-6524-465b-aefc-bbd0f56accdc" containerName="controller-manager" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389568 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2ba57f-6524-465b-aefc-bbd0f56accdc" containerName="controller-manager" Jan 05 13:51:16 crc kubenswrapper[4740]: E0105 13:51:16.389575 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa7b746a-a66d-4b14-8184-b41729626e15" containerName="route-controller-manager" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389581 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa7b746a-a66d-4b14-8184-b41729626e15" containerName="route-controller-manager" Jan 05 13:51:16 crc kubenswrapper[4740]: E0105 13:51:16.389587 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888ada4e-9063-44ac-b293-8de842edb998" containerName="registry-server" Jan 05 13:51:16 crc 
kubenswrapper[4740]: I0105 13:51:16.389593 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="888ada4e-9063-44ac-b293-8de842edb998" containerName="registry-server" Jan 05 13:51:16 crc kubenswrapper[4740]: E0105 13:51:16.389600 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370cecdf-191e-4773-914f-eac3264601e4" containerName="extract-utilities" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389606 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="370cecdf-191e-4773-914f-eac3264601e4" containerName="extract-utilities" Jan 05 13:51:16 crc kubenswrapper[4740]: E0105 13:51:16.389612 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6ab9f1-e8a1-4aac-84ea-059414ff1e31" containerName="extract-utilities" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389618 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6ab9f1-e8a1-4aac-84ea-059414ff1e31" containerName="extract-utilities" Jan 05 13:51:16 crc kubenswrapper[4740]: E0105 13:51:16.389627 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6ab9f1-e8a1-4aac-84ea-059414ff1e31" containerName="registry-server" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389632 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6ab9f1-e8a1-4aac-84ea-059414ff1e31" containerName="registry-server" Jan 05 13:51:16 crc kubenswrapper[4740]: E0105 13:51:16.389638 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888ada4e-9063-44ac-b293-8de842edb998" containerName="extract-content" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389644 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="888ada4e-9063-44ac-b293-8de842edb998" containerName="extract-content" Jan 05 13:51:16 crc kubenswrapper[4740]: E0105 13:51:16.389652 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888ada4e-9063-44ac-b293-8de842edb998" containerName="extract-utilities" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389658 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="888ada4e-9063-44ac-b293-8de842edb998" containerName="extract-utilities" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389745 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a6ab9f1-e8a1-4aac-84ea-059414ff1e31" containerName="registry-server" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389754 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f2ba57f-6524-465b-aefc-bbd0f56accdc" containerName="controller-manager" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389762 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f781df9d-cacf-433b-a588-35f67af41b66" containerName="registry-server" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389773 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="370cecdf-191e-4773-914f-eac3264601e4" containerName="registry-server" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389780 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa7b746a-a66d-4b14-8184-b41729626e15" containerName="route-controller-manager" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.389790 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="888ada4e-9063-44ac-b293-8de842edb998" containerName="registry-server" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.390191 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.392714 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.392748 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.392995 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.393964 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.394044 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7"] Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.394126 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.394707 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.394799 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.399179 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.399210 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.399250 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.399443 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.399542 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.400402 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.403880 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7"] Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.406565 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.411651 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-547b88d74f-q6php"] Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.550942 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-proxy-ca-bundles\") pod \"controller-manager-547b88d74f-q6php\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.550979 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg6cr\" (UniqueName: \"kubernetes.io/projected/f711b0b3-8883-4012-8cbc-ef337e8c074d-kube-api-access-zg6cr\") pod \"controller-manager-547b88d74f-q6php\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.551004 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-config\") pod \"controller-manager-547b88d74f-q6php\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.551022 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab275941-b160-4fac-9026-dc7476466272-serving-cert\") pod \"route-controller-manager-5d6bb8bd5f-cspx7\" (UID: \"ab275941-b160-4fac-9026-dc7476466272\") " pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.551049 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f711b0b3-8883-4012-8cbc-ef337e8c074d-serving-cert\") pod \"controller-manager-547b88d74f-q6php\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.551085 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-client-ca\") pod \"controller-manager-547b88d74f-q6php\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.551129 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgqgk\" (UniqueName: \"kubernetes.io/projected/ab275941-b160-4fac-9026-dc7476466272-kube-api-access-kgqgk\") pod \"route-controller-manager-5d6bb8bd5f-cspx7\" (UID: \"ab275941-b160-4fac-9026-dc7476466272\") " pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.551149 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab275941-b160-4fac-9026-dc7476466272-config\") pod \"route-controller-manager-5d6bb8bd5f-cspx7\" (UID: \"ab275941-b160-4fac-9026-dc7476466272\") " pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.551165 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ab275941-b160-4fac-9026-dc7476466272-client-ca\") pod \"route-controller-manager-5d6bb8bd5f-cspx7\" (UID: \"ab275941-b160-4fac-9026-dc7476466272\") " pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.652055 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgqgk\" (UniqueName: \"kubernetes.io/projected/ab275941-b160-4fac-9026-dc7476466272-kube-api-access-kgqgk\") pod \"route-controller-manager-5d6bb8bd5f-cspx7\" (UID: \"ab275941-b160-4fac-9026-dc7476466272\") " pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.652147 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab275941-b160-4fac-9026-dc7476466272-config\") pod \"route-controller-manager-5d6bb8bd5f-cspx7\" (UID: \"ab275941-b160-4fac-9026-dc7476466272\") " pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.652171 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab275941-b160-4fac-9026-dc7476466272-client-ca\") pod \"route-controller-manager-5d6bb8bd5f-cspx7\" (UID: \"ab275941-b160-4fac-9026-dc7476466272\") " pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.652198 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-proxy-ca-bundles\") pod \"controller-manager-547b88d74f-q6php\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.652221 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg6cr\" (UniqueName: \"kubernetes.io/projected/f711b0b3-8883-4012-8cbc-ef337e8c074d-kube-api-access-zg6cr\") pod \"controller-manager-547b88d74f-q6php\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.652239 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-config\") pod \"controller-manager-547b88d74f-q6php\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.652254 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab275941-b160-4fac-9026-dc7476466272-serving-cert\") pod \"route-controller-manager-5d6bb8bd5f-cspx7\" (UID: \"ab275941-b160-4fac-9026-dc7476466272\") " pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.652281 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f711b0b3-8883-4012-8cbc-ef337e8c074d-serving-cert\") pod 
\"controller-manager-547b88d74f-q6php\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.652300 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-client-ca\") pod \"controller-manager-547b88d74f-q6php\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.653390 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-client-ca\") pod \"controller-manager-547b88d74f-q6php\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.654168 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab275941-b160-4fac-9026-dc7476466272-client-ca\") pod \"route-controller-manager-5d6bb8bd5f-cspx7\" (UID: \"ab275941-b160-4fac-9026-dc7476466272\") " pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.654211 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab275941-b160-4fac-9026-dc7476466272-config\") pod \"route-controller-manager-5d6bb8bd5f-cspx7\" (UID: \"ab275941-b160-4fac-9026-dc7476466272\") " pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.655394 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-config\") pod \"controller-manager-547b88d74f-q6php\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.655592 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-proxy-ca-bundles\") pod \"controller-manager-547b88d74f-q6php\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.670710 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f711b0b3-8883-4012-8cbc-ef337e8c074d-serving-cert\") pod \"controller-manager-547b88d74f-q6php\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.674817 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab275941-b160-4fac-9026-dc7476466272-serving-cert\") pod \"route-controller-manager-5d6bb8bd5f-cspx7\" (UID: \"ab275941-b160-4fac-9026-dc7476466272\") " pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.682477 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg6cr\" (UniqueName: \"kubernetes.io/projected/f711b0b3-8883-4012-8cbc-ef337e8c074d-kube-api-access-zg6cr\") pod \"controller-manager-547b88d74f-q6php\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.682821 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgqgk\" (UniqueName: \"kubernetes.io/projected/ab275941-b160-4fac-9026-dc7476466272-kube-api-access-kgqgk\") pod \"route-controller-manager-5d6bb8bd5f-cspx7\" (UID: \"ab275941-b160-4fac-9026-dc7476466272\") " pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.733932 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.741223 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.958209 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7"] Jan 05 13:51:16 crc kubenswrapper[4740]: W0105 13:51:16.965116 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab275941_b160_4fac_9026_dc7476466272.slice/crio-6328977b655a260b39d474a94fbef4430707020702c2f07f72522f70671b47c1 WatchSource:0}: Error finding container 6328977b655a260b39d474a94fbef4430707020702c2f07f72522f70671b47c1: Status 404 returned error can't find the container with id 6328977b655a260b39d474a94fbef4430707020702c2f07f72522f70671b47c1 Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.973514 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f2ba57f-6524-465b-aefc-bbd0f56accdc" path="/var/lib/kubelet/pods/4f2ba57f-6524-465b-aefc-bbd0f56accdc/volumes" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.974244 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="888ada4e-9063-44ac-b293-8de842edb998" path="/var/lib/kubelet/pods/888ada4e-9063-44ac-b293-8de842edb998/volumes" Jan 05 13:51:16 crc kubenswrapper[4740]: I0105 13:51:16.974915 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa7b746a-a66d-4b14-8184-b41729626e15" path="/var/lib/kubelet/pods/fa7b746a-a66d-4b14-8184-b41729626e15/volumes" Jan 05 13:51:17 crc kubenswrapper[4740]: I0105 13:51:17.023322 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-547b88d74f-q6php"] Jan 05 13:51:17 crc kubenswrapper[4740]: W0105 13:51:17.046648 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf711b0b3_8883_4012_8cbc_ef337e8c074d.slice/crio-843d76e1346b98c8c870a0e67432509ed4b5e00c2827c7cb65ca9d1eee67711e WatchSource:0}: Error finding container 843d76e1346b98c8c870a0e67432509ed4b5e00c2827c7cb65ca9d1eee67711e: Status 404 returned error can't find the container with id 843d76e1346b98c8c870a0e67432509ed4b5e00c2827c7cb65ca9d1eee67711e Jan 05 13:51:17 crc kubenswrapper[4740]: I0105 13:51:17.063289 4740 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" event={"ID":"ab275941-b160-4fac-9026-dc7476466272","Type":"ContainerStarted","Data":"6328977b655a260b39d474a94fbef4430707020702c2f07f72522f70671b47c1"} Jan 05 13:51:17 crc kubenswrapper[4740]: I0105 13:51:17.069532 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" event={"ID":"f711b0b3-8883-4012-8cbc-ef337e8c074d","Type":"ContainerStarted","Data":"843d76e1346b98c8c870a0e67432509ed4b5e00c2827c7cb65ca9d1eee67711e"} Jan 05 13:51:18 crc kubenswrapper[4740]: I0105 13:51:18.081480 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" event={"ID":"f711b0b3-8883-4012-8cbc-ef337e8c074d","Type":"ContainerStarted","Data":"4db5b067ae2fd8a6570ebf9dacfeff842728d969cebda67d5094a1a699e556dc"} Jan 05 13:51:18 crc kubenswrapper[4740]: I0105 13:51:18.082620 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:18 crc kubenswrapper[4740]: I0105 13:51:18.083126 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" event={"ID":"ab275941-b160-4fac-9026-dc7476466272","Type":"ContainerStarted","Data":"e1ee5ceac69aaaa6373e4eb66d493e9694d21d0bae64ef572cbd742e8f6dbcee"} Jan 05 13:51:18 crc kubenswrapper[4740]: I0105 13:51:18.083627 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:51:18 crc kubenswrapper[4740]: I0105 13:51:18.087396 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:51:18 crc kubenswrapper[4740]: I0105 13:51:18.088259 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:51:18 crc kubenswrapper[4740]: I0105 13:51:18.104753 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" podStartSLOduration=4.104726335 podStartE2EDuration="4.104726335s" podCreationTimestamp="2026-01-05 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:51:18.097965604 +0000 UTC m=+127.404874203" watchObservedRunningTime="2026-01-05 13:51:18.104726335 +0000 UTC m=+127.411634914" Jan 05 13:51:18 crc kubenswrapper[4740]: I0105 13:51:18.139028 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" podStartSLOduration=4.139010285 podStartE2EDuration="4.139010285s" podCreationTimestamp="2026-01-05 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:51:18.138434549 +0000 UTC m=+127.445343138" watchObservedRunningTime="2026-01-05 13:51:18.139010285 +0000 UTC m=+127.445918864" Jan 05 13:51:18 crc kubenswrapper[4740]: I0105 13:51:18.294900 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5msgl"] Jan 05 13:51:20 crc 
kubenswrapper[4740]: I0105 13:51:20.671039 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k8t6t"] Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.671844 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k8t6t" podUID="d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2" containerName="registry-server" containerID="cri-o://693a1594e8ad2aed0edca20c218625b41b6313f4fe8a1d011abc15898bb6a768" gracePeriod=30 Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.680816 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pkbs6"] Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.681027 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pkbs6" podUID="8655819a-f2c2-4044-85e1-84f7e64cff21" containerName="registry-server" containerID="cri-o://0585f307a447b9f904aef33ae7e3c20a4886e3d7da436125939a6546cb4cddcf" gracePeriod=30 Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.687860 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45gg9"] Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.688038 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" podUID="80bfed68-6820-4458-aa8a-779cc3120e43" containerName="marketplace-operator" containerID="cri-o://4724174628abc536dad3149a35513f30b96fed7b55a35373eb785017010ecf8f" gracePeriod=30 Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.695771 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-br6rk"] Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.695981 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-br6rk" podUID="c70970cb-3f87-4296-96c0-c827e79eaee6" containerName="registry-server" containerID="cri-o://df00718016e50606530aad457f5708decb29b963c713eb5b6fdee4880d91743c" gracePeriod=30 Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.702837 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-scltx"] Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.703048 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-scltx" podUID="2200bbba-e16a-4b74-90b2-9a037bfe0e7d" containerName="registry-server" containerID="cri-o://80b2a567f7dfdf713c838fe9f1a9903af2d43798ca7360770bc43827754d966d" gracePeriod=30 Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.711206 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mrdpz"] Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.711978 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.724430 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mrdpz"] Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.743851 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22d5567a-2314-42aa-b197-dac963dcbfd1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mrdpz\" (UID: \"22d5567a-2314-42aa-b197-dac963dcbfd1\") " pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.743897 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/22d5567a-2314-42aa-b197-dac963dcbfd1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mrdpz\" (UID: \"22d5567a-2314-42aa-b197-dac963dcbfd1\") " pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.743960 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjx85\" (UniqueName: \"kubernetes.io/projected/22d5567a-2314-42aa-b197-dac963dcbfd1-kube-api-access-xjx85\") pod \"marketplace-operator-79b997595-mrdpz\" (UID: \"22d5567a-2314-42aa-b197-dac963dcbfd1\") " pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.844960 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjx85\" (UniqueName: \"kubernetes.io/projected/22d5567a-2314-42aa-b197-dac963dcbfd1-kube-api-access-xjx85\") pod \"marketplace-operator-79b997595-mrdpz\" (UID: \"22d5567a-2314-42aa-b197-dac963dcbfd1\") " pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.845010 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22d5567a-2314-42aa-b197-dac963dcbfd1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mrdpz\" (UID: \"22d5567a-2314-42aa-b197-dac963dcbfd1\") " pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.845033 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/22d5567a-2314-42aa-b197-dac963dcbfd1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mrdpz\" (UID: \"22d5567a-2314-42aa-b197-dac963dcbfd1\") " pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.846787 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22d5567a-2314-42aa-b197-dac963dcbfd1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mrdpz\" (UID: \"22d5567a-2314-42aa-b197-dac963dcbfd1\") " pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.850579 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/22d5567a-2314-42aa-b197-dac963dcbfd1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mrdpz\" (UID: \"22d5567a-2314-42aa-b197-dac963dcbfd1\") " pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" Jan 05 13:51:20 crc kubenswrapper[4740]: I0105 13:51:20.864779 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjx85\" (UniqueName: \"kubernetes.io/projected/22d5567a-2314-42aa-b197-dac963dcbfd1-kube-api-access-xjx85\") pod \"marketplace-operator-79b997595-mrdpz\" (UID: \"22d5567a-2314-42aa-b197-dac963dcbfd1\") " pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.071816 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.101694 4740 generic.go:334] "Generic (PLEG): container finished" podID="80bfed68-6820-4458-aa8a-779cc3120e43" containerID="4724174628abc536dad3149a35513f30b96fed7b55a35373eb785017010ecf8f" exitCode=0 Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.101778 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" event={"ID":"80bfed68-6820-4458-aa8a-779cc3120e43","Type":"ContainerDied","Data":"4724174628abc536dad3149a35513f30b96fed7b55a35373eb785017010ecf8f"} Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.104987 4740 generic.go:334] "Generic (PLEG): container finished" podID="c70970cb-3f87-4296-96c0-c827e79eaee6" containerID="df00718016e50606530aad457f5708decb29b963c713eb5b6fdee4880d91743c" exitCode=0 Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.105202 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-br6rk" event={"ID":"c70970cb-3f87-4296-96c0-c827e79eaee6","Type":"ContainerDied","Data":"df00718016e50606530aad457f5708decb29b963c713eb5b6fdee4880d91743c"} Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.108748 4740 generic.go:334] "Generic (PLEG): container finished" podID="2200bbba-e16a-4b74-90b2-9a037bfe0e7d" containerID="80b2a567f7dfdf713c838fe9f1a9903af2d43798ca7360770bc43827754d966d" exitCode=0 Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.108852 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scltx" event={"ID":"2200bbba-e16a-4b74-90b2-9a037bfe0e7d","Type":"ContainerDied","Data":"80b2a567f7dfdf713c838fe9f1a9903af2d43798ca7360770bc43827754d966d"} Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.111951 4740 generic.go:334] "Generic (PLEG): container finished" podID="d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2" containerID="693a1594e8ad2aed0edca20c218625b41b6313f4fe8a1d011abc15898bb6a768" exitCode=0 Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.112023 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8t6t" event={"ID":"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2","Type":"ContainerDied","Data":"693a1594e8ad2aed0edca20c218625b41b6313f4fe8a1d011abc15898bb6a768"} Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.112055 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8t6t" event={"ID":"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2","Type":"ContainerDied","Data":"b0249a45d4de176b7d276028e1fb74762afbe6b44a0d121ee2138ccdcad052bd"} Jan 05 13:51:21 crc 
kubenswrapper[4740]: I0105 13:51:21.112084 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0249a45d4de176b7d276028e1fb74762afbe6b44a0d121ee2138ccdcad052bd" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.115747 4740 generic.go:334] "Generic (PLEG): container finished" podID="8655819a-f2c2-4044-85e1-84f7e64cff21" containerID="0585f307a447b9f904aef33ae7e3c20a4886e3d7da436125939a6546cb4cddcf" exitCode=0 Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.115773 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkbs6" event={"ID":"8655819a-f2c2-4044-85e1-84f7e64cff21","Type":"ContainerDied","Data":"0585f307a447b9f904aef33ae7e3c20a4886e3d7da436125939a6546cb4cddcf"} Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.154547 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k8t6t" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.174831 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.180662 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-scltx" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.249978 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/80bfed68-6820-4458-aa8a-779cc3120e43-marketplace-operator-metrics\") pod \"80bfed68-6820-4458-aa8a-779cc3120e43\" (UID: \"80bfed68-6820-4458-aa8a-779cc3120e43\") " Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.250038 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-utilities\") pod \"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2\" (UID: \"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2\") " Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.250107 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-catalog-content\") pod \"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2\" (UID: \"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2\") " Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.250142 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8c5z\" (UniqueName: \"kubernetes.io/projected/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-kube-api-access-x8c5z\") pod \"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2\" (UID: \"d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2\") " Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.250166 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbcbk\" (UniqueName: \"kubernetes.io/projected/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-kube-api-access-pbcbk\") pod \"2200bbba-e16a-4b74-90b2-9a037bfe0e7d\" (UID: \"2200bbba-e16a-4b74-90b2-9a037bfe0e7d\") " Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.250193 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mmph\" (UniqueName: \"kubernetes.io/projected/80bfed68-6820-4458-aa8a-779cc3120e43-kube-api-access-2mmph\") pod \"80bfed68-6820-4458-aa8a-779cc3120e43\" (UID: 
\"80bfed68-6820-4458-aa8a-779cc3120e43\") " Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.250256 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-catalog-content\") pod \"2200bbba-e16a-4b74-90b2-9a037bfe0e7d\" (UID: \"2200bbba-e16a-4b74-90b2-9a037bfe0e7d\") " Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.250286 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80bfed68-6820-4458-aa8a-779cc3120e43-marketplace-trusted-ca\") pod \"80bfed68-6820-4458-aa8a-779cc3120e43\" (UID: \"80bfed68-6820-4458-aa8a-779cc3120e43\") " Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.250330 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-utilities\") pod \"2200bbba-e16a-4b74-90b2-9a037bfe0e7d\" (UID: \"2200bbba-e16a-4b74-90b2-9a037bfe0e7d\") " Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.251291 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-utilities" (OuterVolumeSpecName: "utilities") pod "2200bbba-e16a-4b74-90b2-9a037bfe0e7d" (UID: "2200bbba-e16a-4b74-90b2-9a037bfe0e7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.256181 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-kube-api-access-x8c5z" (OuterVolumeSpecName: "kube-api-access-x8c5z") pod "d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2" (UID: "d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2"). InnerVolumeSpecName "kube-api-access-x8c5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.258488 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-kube-api-access-pbcbk" (OuterVolumeSpecName: "kube-api-access-pbcbk") pod "2200bbba-e16a-4b74-90b2-9a037bfe0e7d" (UID: "2200bbba-e16a-4b74-90b2-9a037bfe0e7d"). InnerVolumeSpecName "kube-api-access-pbcbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.260350 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80bfed68-6820-4458-aa8a-779cc3120e43-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "80bfed68-6820-4458-aa8a-779cc3120e43" (UID: "80bfed68-6820-4458-aa8a-779cc3120e43"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.260577 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-utilities" (OuterVolumeSpecName: "utilities") pod "d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2" (UID: "d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.261251 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80bfed68-6820-4458-aa8a-779cc3120e43-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "80bfed68-6820-4458-aa8a-779cc3120e43" (UID: "80bfed68-6820-4458-aa8a-779cc3120e43"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.262519 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80bfed68-6820-4458-aa8a-779cc3120e43-kube-api-access-2mmph" (OuterVolumeSpecName: "kube-api-access-2mmph") pod "80bfed68-6820-4458-aa8a-779cc3120e43" (UID: "80bfed68-6820-4458-aa8a-779cc3120e43"). InnerVolumeSpecName "kube-api-access-2mmph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.307760 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2" (UID: "d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.351573 4740 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/80bfed68-6820-4458-aa8a-779cc3120e43-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.351602 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.351614 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.351657 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8c5z\" (UniqueName: \"kubernetes.io/projected/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2-kube-api-access-x8c5z\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.351668 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbcbk\" (UniqueName: \"kubernetes.io/projected/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-kube-api-access-pbcbk\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.351676 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mmph\" (UniqueName: \"kubernetes.io/projected/80bfed68-6820-4458-aa8a-779cc3120e43-kube-api-access-2mmph\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.351685 4740 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80bfed68-6820-4458-aa8a-779cc3120e43-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.351693 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.387345 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2200bbba-e16a-4b74-90b2-9a037bfe0e7d" (UID: "2200bbba-e16a-4b74-90b2-9a037bfe0e7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.453299 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2200bbba-e16a-4b74-90b2-9a037bfe0e7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:21 crc kubenswrapper[4740]: I0105 13:51:21.616293 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mrdpz"] Jan 05 13:51:21 crc kubenswrapper[4740]: W0105 13:51:21.630184 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22d5567a_2314_42aa_b197_dac963dcbfd1.slice/crio-854bfe5858ce7a3ab5b8f3055ac25ac34bcca5916a18be18aa34f5182b79e615 WatchSource:0}: Error finding container 854bfe5858ce7a3ab5b8f3055ac25ac34bcca5916a18be18aa34f5182b79e615: Status 404 returned error can't find the container with id 854bfe5858ce7a3ab5b8f3055ac25ac34bcca5916a18be18aa34f5182b79e615 Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.019217 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-br6rk" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.023119 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pkbs6" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.059158 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8655819a-f2c2-4044-85e1-84f7e64cff21-catalog-content\") pod \"8655819a-f2c2-4044-85e1-84f7e64cff21\" (UID: \"8655819a-f2c2-4044-85e1-84f7e64cff21\") " Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.059220 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8655819a-f2c2-4044-85e1-84f7e64cff21-utilities\") pod \"8655819a-f2c2-4044-85e1-84f7e64cff21\" (UID: \"8655819a-f2c2-4044-85e1-84f7e64cff21\") " Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.059251 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fplgh\" (UniqueName: \"kubernetes.io/projected/c70970cb-3f87-4296-96c0-c827e79eaee6-kube-api-access-fplgh\") pod \"c70970cb-3f87-4296-96c0-c827e79eaee6\" (UID: \"c70970cb-3f87-4296-96c0-c827e79eaee6\") " Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.059319 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c70970cb-3f87-4296-96c0-c827e79eaee6-catalog-content\") pod \"c70970cb-3f87-4296-96c0-c827e79eaee6\" (UID: \"c70970cb-3f87-4296-96c0-c827e79eaee6\") " Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.059402 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c70970cb-3f87-4296-96c0-c827e79eaee6-utilities\") pod \"c70970cb-3f87-4296-96c0-c827e79eaee6\" (UID: \"c70970cb-3f87-4296-96c0-c827e79eaee6\") " Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.059428 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kfdw\" (UniqueName: \"kubernetes.io/projected/8655819a-f2c2-4044-85e1-84f7e64cff21-kube-api-access-6kfdw\") pod \"8655819a-f2c2-4044-85e1-84f7e64cff21\" (UID: \"8655819a-f2c2-4044-85e1-84f7e64cff21\") " Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.061867 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c70970cb-3f87-4296-96c0-c827e79eaee6-utilities" (OuterVolumeSpecName: "utilities") pod "c70970cb-3f87-4296-96c0-c827e79eaee6" (UID: "c70970cb-3f87-4296-96c0-c827e79eaee6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.062134 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8655819a-f2c2-4044-85e1-84f7e64cff21-utilities" (OuterVolumeSpecName: "utilities") pod "8655819a-f2c2-4044-85e1-84f7e64cff21" (UID: "8655819a-f2c2-4044-85e1-84f7e64cff21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.066730 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8655819a-f2c2-4044-85e1-84f7e64cff21-kube-api-access-6kfdw" (OuterVolumeSpecName: "kube-api-access-6kfdw") pod "8655819a-f2c2-4044-85e1-84f7e64cff21" (UID: "8655819a-f2c2-4044-85e1-84f7e64cff21"). InnerVolumeSpecName "kube-api-access-6kfdw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.075725 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70970cb-3f87-4296-96c0-c827e79eaee6-kube-api-access-fplgh" (OuterVolumeSpecName: "kube-api-access-fplgh") pod "c70970cb-3f87-4296-96c0-c827e79eaee6" (UID: "c70970cb-3f87-4296-96c0-c827e79eaee6"). InnerVolumeSpecName "kube-api-access-fplgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.094232 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c70970cb-3f87-4296-96c0-c827e79eaee6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c70970cb-3f87-4296-96c0-c827e79eaee6" (UID: "c70970cb-3f87-4296-96c0-c827e79eaee6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.123926 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8655819a-f2c2-4044-85e1-84f7e64cff21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8655819a-f2c2-4044-85e1-84f7e64cff21" (UID: "8655819a-f2c2-4044-85e1-84f7e64cff21"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.130237 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-scltx" event={"ID":"2200bbba-e16a-4b74-90b2-9a037bfe0e7d","Type":"ContainerDied","Data":"af82b9304183f096c9a7b2e44a6bc0ad784085ab3ebc7be6a888c90ca84d35f6"} Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.130288 4740 scope.go:117] "RemoveContainer" containerID="80b2a567f7dfdf713c838fe9f1a9903af2d43798ca7360770bc43827754d966d" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.130434 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-scltx" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.132422 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" event={"ID":"22d5567a-2314-42aa-b197-dac963dcbfd1","Type":"ContainerStarted","Data":"1d54ab4034586ef7e6efe9eac50c58943c4778578783772216d3aa14d2e4dc9e"} Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.132463 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" event={"ID":"22d5567a-2314-42aa-b197-dac963dcbfd1","Type":"ContainerStarted","Data":"854bfe5858ce7a3ab5b8f3055ac25ac34bcca5916a18be18aa34f5182b79e615"} Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.132686 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.135431 4740 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-mrdpz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.61:8080/healthz\": dial tcp 10.217.0.61:8080: connect: connection refused" start-of-body= Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.135475 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" podUID="22d5567a-2314-42aa-b197-dac963dcbfd1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.61:8080/healthz\": dial tcp 10.217.0.61:8080: connect: connection refused" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.141826 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pkbs6" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.141911 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkbs6" event={"ID":"8655819a-f2c2-4044-85e1-84f7e64cff21","Type":"ContainerDied","Data":"ed64ace77ae6f75d8e5fee98407219ba72324b09afb7027b8fb4824cb51f4e7d"} Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.144536 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" event={"ID":"80bfed68-6820-4458-aa8a-779cc3120e43","Type":"ContainerDied","Data":"854fa3fc7d86c381b0bfa208811c3eea7363f098b14079329c33da98b6335991"} Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.144662 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-45gg9" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.156626 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-br6rk" event={"ID":"c70970cb-3f87-4296-96c0-c827e79eaee6","Type":"ContainerDied","Data":"5d1c561f4db4c1ef45aa873f85d1a76978e1fc5fc0b0d71e2e1dcfdb854891e8"} Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.156661 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-br6rk" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.156643 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k8t6t" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.156992 4740 scope.go:117] "RemoveContainer" containerID="4094f42065fc96102b08317e3ce818b7ab60ae0c070edc4659d83c15c34743fa" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.157080 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" podStartSLOduration=2.1570498479999998 podStartE2EDuration="2.157049848s" podCreationTimestamp="2026-01-05 13:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:51:22.152015983 +0000 UTC m=+131.458924562" watchObservedRunningTime="2026-01-05 13:51:22.157049848 +0000 UTC m=+131.463958417" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.160422 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c70970cb-3f87-4296-96c0-c827e79eaee6-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.160445 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kfdw\" (UniqueName: \"kubernetes.io/projected/8655819a-f2c2-4044-85e1-84f7e64cff21-kube-api-access-6kfdw\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.160454 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8655819a-f2c2-4044-85e1-84f7e64cff21-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.160464 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8655819a-f2c2-4044-85e1-84f7e64cff21-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.160473 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fplgh\" (UniqueName: \"kubernetes.io/projected/c70970cb-3f87-4296-96c0-c827e79eaee6-kube-api-access-fplgh\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.160480 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c70970cb-3f87-4296-96c0-c827e79eaee6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.180787 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pkbs6"] Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.186027 4740 scope.go:117] "RemoveContainer" containerID="9efc908e0ab789a4125e18ba7ab335f178a98495d053fd5976a5a12d6009cb12" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.193744 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pkbs6"] Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.216437 4740 scope.go:117] "RemoveContainer" containerID="0585f307a447b9f904aef33ae7e3c20a4886e3d7da436125939a6546cb4cddcf" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.226320 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-br6rk"] Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.230300 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-br6rk"] Jan 05 13:51:22 crc 
kubenswrapper[4740]: I0105 13:51:22.230811 4740 scope.go:117] "RemoveContainer" containerID="1343926588d4ec9530449e43fd7a5ef7ab1b02b73bf3092d174888c296d48ace" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.235994 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45gg9"] Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.243893 4740 scope.go:117] "RemoveContainer" containerID="036bbad9431ac4e1211dcc21964f324b485183cae176769c96320fbd14331c2d" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.243954 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45gg9"] Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.250016 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k8t6t"] Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.254194 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k8t6t"] Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.259466 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-scltx"] Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.262175 4740 scope.go:117] "RemoveContainer" containerID="4724174628abc536dad3149a35513f30b96fed7b55a35373eb785017010ecf8f" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.262287 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-scltx"] Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.276955 4740 scope.go:117] "RemoveContainer" containerID="df00718016e50606530aad457f5708decb29b963c713eb5b6fdee4880d91743c" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.293527 4740 scope.go:117] "RemoveContainer" containerID="249d83df8b127c9f8e1e755b7878bd5e4e7d72ebf8eee64ddd97091e146375cf" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.313164 4740 scope.go:117] "RemoveContainer" containerID="3d5e70b859486172c4f98f41c8dee74d320a3027dccfc05fe536fb4987eeecf8" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.976316 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2200bbba-e16a-4b74-90b2-9a037bfe0e7d" path="/var/lib/kubelet/pods/2200bbba-e16a-4b74-90b2-9a037bfe0e7d/volumes" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.977287 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80bfed68-6820-4458-aa8a-779cc3120e43" path="/var/lib/kubelet/pods/80bfed68-6820-4458-aa8a-779cc3120e43/volumes" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.977826 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8655819a-f2c2-4044-85e1-84f7e64cff21" path="/var/lib/kubelet/pods/8655819a-f2c2-4044-85e1-84f7e64cff21/volumes" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.978952 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70970cb-3f87-4296-96c0-c827e79eaee6" path="/var/lib/kubelet/pods/c70970cb-3f87-4296-96c0-c827e79eaee6/volumes" Jan 05 13:51:22 crc kubenswrapper[4740]: I0105 13:51:22.979595 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2" path="/var/lib/kubelet/pods/d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2/volumes" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.172949 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.942819 4740 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.943450 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://af635a52f5998f29d57eb30e524277e0de1a44bd16186765a90c89a436d081c8" gracePeriod=15 Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.943463 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://917ea24c6908e93ed0cfebe58cb70be97a7506c6f1e4fa4897c5338f719a2ae3" gracePeriod=15 Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.943460 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://73a92a17999d6c21802b190bb9c92d479fbb1bdb6bdf8296aa7261a24648e372" gracePeriod=15 Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.943519 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://cceba3a93ce2fe2e88dfbf0a42bde5732d11950d69f4551b08901b490ff34a98" gracePeriod=15 Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.943690 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://8521c28bbe24bdc03c481b2c1c6e7fdbd7e0dff404d66600380c78da826a03af" gracePeriod=15 Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944094 4740 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944288 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70970cb-3f87-4296-96c0-c827e79eaee6" containerName="registry-server" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944305 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70970cb-3f87-4296-96c0-c827e79eaee6" containerName="registry-server" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944316 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2" containerName="extract-content" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944321 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2" containerName="extract-content" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944331 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944336 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944346 4740 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2200bbba-e16a-4b74-90b2-9a037bfe0e7d" containerName="registry-server" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944353 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="2200bbba-e16a-4b74-90b2-9a037bfe0e7d" containerName="registry-server" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944361 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944367 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944373 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70970cb-3f87-4296-96c0-c827e79eaee6" containerName="extract-utilities" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944379 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70970cb-3f87-4296-96c0-c827e79eaee6" containerName="extract-utilities" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944387 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2" containerName="registry-server" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944393 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2" containerName="registry-server" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944401 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944407 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944415 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70970cb-3f87-4296-96c0-c827e79eaee6" containerName="extract-content" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944420 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70970cb-3f87-4296-96c0-c827e79eaee6" containerName="extract-content" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944429 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8655819a-f2c2-4044-85e1-84f7e64cff21" containerName="extract-content" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944434 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="8655819a-f2c2-4044-85e1-84f7e64cff21" containerName="extract-content" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944442 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2200bbba-e16a-4b74-90b2-9a037bfe0e7d" containerName="extract-content" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944448 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="2200bbba-e16a-4b74-90b2-9a037bfe0e7d" containerName="extract-content" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944457 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80bfed68-6820-4458-aa8a-779cc3120e43" containerName="marketplace-operator" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944462 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="80bfed68-6820-4458-aa8a-779cc3120e43" containerName="marketplace-operator" Jan 05 13:51:23 crc 
kubenswrapper[4740]: E0105 13:51:23.944469 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944474 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944482 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2" containerName="extract-utilities" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944487 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2" containerName="extract-utilities" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944494 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8655819a-f2c2-4044-85e1-84f7e64cff21" containerName="registry-server" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944500 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="8655819a-f2c2-4044-85e1-84f7e64cff21" containerName="registry-server" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944508 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944513 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944521 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8655819a-f2c2-4044-85e1-84f7e64cff21" containerName="extract-utilities" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944526 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="8655819a-f2c2-4044-85e1-84f7e64cff21" containerName="extract-utilities" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944537 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2200bbba-e16a-4b74-90b2-9a037bfe0e7d" containerName="extract-utilities" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944543 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="2200bbba-e16a-4b74-90b2-9a037bfe0e7d" containerName="extract-utilities" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944550 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944556 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944651 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2dbc1ff-ca19-4faf-abb5-3b4fe99918f2" containerName="registry-server" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944662 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944669 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944676 4740 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944682 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944689 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="80bfed68-6820-4458-aa8a-779cc3120e43" containerName="marketplace-operator" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944698 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944704 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="8655819a-f2c2-4044-85e1-84f7e64cff21" containerName="registry-server" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944711 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944720 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="c70970cb-3f87-4296-96c0-c827e79eaee6" containerName="registry-server" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944727 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="2200bbba-e16a-4b74-90b2-9a037bfe0e7d" containerName="registry-server" Jan 05 13:51:23 crc kubenswrapper[4740]: E0105 13:51:23.944805 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.944811 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.945874 4740 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.946382 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.949921 4740 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.980399 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.980464 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.980571 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.980627 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.980650 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.980690 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.980717 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:23 crc kubenswrapper[4740]: I0105 13:51:23.980799 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.081514 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.081561 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.081591 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.081616 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.081652 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.081683 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.081811 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.081853 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.082478 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.082522 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.082557 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.082591 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.082624 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.082653 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.082837 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.082847 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.175197 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.176627 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.177192 4740 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="917ea24c6908e93ed0cfebe58cb70be97a7506c6f1e4fa4897c5338f719a2ae3" exitCode=0 Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.177212 4740 generic.go:334] "Generic (PLEG): 
container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8521c28bbe24bdc03c481b2c1c6e7fdbd7e0dff404d66600380c78da826a03af" exitCode=0 Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.177218 4740 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="73a92a17999d6c21802b190bb9c92d479fbb1bdb6bdf8296aa7261a24648e372" exitCode=0 Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.177224 4740 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cceba3a93ce2fe2e88dfbf0a42bde5732d11950d69f4551b08901b490ff34a98" exitCode=2 Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.177282 4740 scope.go:117] "RemoveContainer" containerID="ea6209a030996cf73f4372648fdbeddee16d680ca0366c40ba7e1a2a42b5ac5b" Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.178431 4740 generic.go:334] "Generic (PLEG): container finished" podID="14e93303-35a5-401b-b6d0-49931cb458eb" containerID="194a48d4c18dd856a9f530a03a2bb131e5315d60d7fb8ab84dc1f2ab32f9f5e3" exitCode=0 Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.178515 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"14e93303-35a5-401b-b6d0-49931cb458eb","Type":"ContainerDied","Data":"194a48d4c18dd856a9f530a03a2bb131e5315d60d7fb8ab84dc1f2ab32f9f5e3"} Jan 05 13:51:24 crc kubenswrapper[4740]: I0105 13:51:24.179138 4740 status_manager.go:851] "Failed to get status for pod" podUID="14e93303-35a5-401b-b6d0-49931cb458eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:25 crc kubenswrapper[4740]: E0105 13:51:25.044537 4740 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:25 crc kubenswrapper[4740]: E0105 13:51:25.045493 4740 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:25 crc kubenswrapper[4740]: E0105 13:51:25.045823 4740 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:25 crc kubenswrapper[4740]: E0105 13:51:25.046373 4740 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:25 crc kubenswrapper[4740]: E0105 13:51:25.047018 4740 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:25 crc kubenswrapper[4740]: I0105 13:51:25.047110 4740 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 05 13:51:25 crc kubenswrapper[4740]: E0105 13:51:25.047468 4740 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" interval="200ms" Jan 05 13:51:25 crc kubenswrapper[4740]: I0105 13:51:25.186124 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 05 13:51:25 crc kubenswrapper[4740]: E0105 13:51:25.248983 4740 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" interval="400ms" Jan 05 13:51:25 crc kubenswrapper[4740]: I0105 13:51:25.529504 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 05 13:51:25 crc kubenswrapper[4740]: I0105 13:51:25.530093 4740 status_manager.go:851] "Failed to get status for pod" podUID="14e93303-35a5-401b-b6d0-49931cb458eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:25 crc kubenswrapper[4740]: I0105 13:51:25.597035 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/14e93303-35a5-401b-b6d0-49931cb458eb-var-lock\") pod \"14e93303-35a5-401b-b6d0-49931cb458eb\" (UID: \"14e93303-35a5-401b-b6d0-49931cb458eb\") " Jan 05 13:51:25 crc kubenswrapper[4740]: I0105 13:51:25.597158 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14e93303-35a5-401b-b6d0-49931cb458eb-var-lock" (OuterVolumeSpecName: "var-lock") pod "14e93303-35a5-401b-b6d0-49931cb458eb" (UID: "14e93303-35a5-401b-b6d0-49931cb458eb"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:51:25 crc kubenswrapper[4740]: I0105 13:51:25.597199 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14e93303-35a5-401b-b6d0-49931cb458eb-kubelet-dir\") pod \"14e93303-35a5-401b-b6d0-49931cb458eb\" (UID: \"14e93303-35a5-401b-b6d0-49931cb458eb\") " Jan 05 13:51:25 crc kubenswrapper[4740]: I0105 13:51:25.597295 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14e93303-35a5-401b-b6d0-49931cb458eb-kube-api-access\") pod \"14e93303-35a5-401b-b6d0-49931cb458eb\" (UID: \"14e93303-35a5-401b-b6d0-49931cb458eb\") " Jan 05 13:51:25 crc kubenswrapper[4740]: I0105 13:51:25.597299 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14e93303-35a5-401b-b6d0-49931cb458eb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "14e93303-35a5-401b-b6d0-49931cb458eb" (UID: "14e93303-35a5-401b-b6d0-49931cb458eb"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:51:25 crc kubenswrapper[4740]: I0105 13:51:25.597514 4740 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/14e93303-35a5-401b-b6d0-49931cb458eb-var-lock\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:25 crc kubenswrapper[4740]: I0105 13:51:25.597543 4740 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14e93303-35a5-401b-b6d0-49931cb458eb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:25 crc kubenswrapper[4740]: I0105 13:51:25.604664 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14e93303-35a5-401b-b6d0-49931cb458eb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "14e93303-35a5-401b-b6d0-49931cb458eb" (UID: "14e93303-35a5-401b-b6d0-49931cb458eb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:51:25 crc kubenswrapper[4740]: E0105 13:51:25.650816 4740 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" interval="800ms" Jan 05 13:51:25 crc kubenswrapper[4740]: I0105 13:51:25.698569 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14e93303-35a5-401b-b6d0-49931cb458eb-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.194645 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"14e93303-35a5-401b-b6d0-49931cb458eb","Type":"ContainerDied","Data":"dd9c32b9602d80869b2d8c313691e59ced83d60a20430f5636394ea82f82676d"} Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.194984 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd9c32b9602d80869b2d8c313691e59ced83d60a20430f5636394ea82f82676d" Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.194694 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.207272 4740 status_manager.go:851] "Failed to get status for pod" podUID="14e93303-35a5-401b-b6d0-49931cb458eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.347621 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.348552 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.349202 4740 status_manager.go:851] "Failed to get status for pod" podUID="14e93303-35a5-401b-b6d0-49931cb458eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.349837 4740 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.412840 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.412947 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.413004 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.413183 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.413189 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.413291 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.413620 4740 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.413655 4740 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.413673 4740 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:26 crc kubenswrapper[4740]: E0105 13:51:26.453009 4740 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" interval="1.6s" Jan 05 13:51:26 crc kubenswrapper[4740]: I0105 13:51:26.979753 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.202737 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.204228 4740 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="af635a52f5998f29d57eb30e524277e0de1a44bd16186765a90c89a436d081c8" exitCode=0 Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.204284 4740 scope.go:117] "RemoveContainer" containerID="917ea24c6908e93ed0cfebe58cb70be97a7506c6f1e4fa4897c5338f719a2ae3" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.204399 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.205153 4740 status_manager.go:851] "Failed to get status for pod" podUID="14e93303-35a5-401b-b6d0-49931cb458eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.205321 4740 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.207953 4740 status_manager.go:851] "Failed to get status for pod" podUID="14e93303-35a5-401b-b6d0-49931cb458eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.208586 4740 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.232957 4740 scope.go:117] "RemoveContainer" containerID="8521c28bbe24bdc03c481b2c1c6e7fdbd7e0dff404d66600380c78da826a03af" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.248663 4740 scope.go:117] "RemoveContainer" containerID="73a92a17999d6c21802b190bb9c92d479fbb1bdb6bdf8296aa7261a24648e372" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.266325 4740 scope.go:117] "RemoveContainer" containerID="cceba3a93ce2fe2e88dfbf0a42bde5732d11950d69f4551b08901b490ff34a98" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.285304 4740 scope.go:117] "RemoveContainer" containerID="af635a52f5998f29d57eb30e524277e0de1a44bd16186765a90c89a436d081c8" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.304309 4740 scope.go:117] "RemoveContainer" containerID="55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.324612 4740 scope.go:117] "RemoveContainer" containerID="917ea24c6908e93ed0cfebe58cb70be97a7506c6f1e4fa4897c5338f719a2ae3" Jan 05 13:51:27 crc kubenswrapper[4740]: E0105 13:51:27.325645 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"917ea24c6908e93ed0cfebe58cb70be97a7506c6f1e4fa4897c5338f719a2ae3\": container with ID starting with 917ea24c6908e93ed0cfebe58cb70be97a7506c6f1e4fa4897c5338f719a2ae3 not found: ID does not exist" containerID="917ea24c6908e93ed0cfebe58cb70be97a7506c6f1e4fa4897c5338f719a2ae3" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.325679 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"917ea24c6908e93ed0cfebe58cb70be97a7506c6f1e4fa4897c5338f719a2ae3"} err="failed to get container status \"917ea24c6908e93ed0cfebe58cb70be97a7506c6f1e4fa4897c5338f719a2ae3\": rpc error: code = NotFound desc = could not find container 
\"917ea24c6908e93ed0cfebe58cb70be97a7506c6f1e4fa4897c5338f719a2ae3\": container with ID starting with 917ea24c6908e93ed0cfebe58cb70be97a7506c6f1e4fa4897c5338f719a2ae3 not found: ID does not exist" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.325714 4740 scope.go:117] "RemoveContainer" containerID="8521c28bbe24bdc03c481b2c1c6e7fdbd7e0dff404d66600380c78da826a03af" Jan 05 13:51:27 crc kubenswrapper[4740]: E0105 13:51:27.326080 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8521c28bbe24bdc03c481b2c1c6e7fdbd7e0dff404d66600380c78da826a03af\": container with ID starting with 8521c28bbe24bdc03c481b2c1c6e7fdbd7e0dff404d66600380c78da826a03af not found: ID does not exist" containerID="8521c28bbe24bdc03c481b2c1c6e7fdbd7e0dff404d66600380c78da826a03af" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.326114 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8521c28bbe24bdc03c481b2c1c6e7fdbd7e0dff404d66600380c78da826a03af"} err="failed to get container status \"8521c28bbe24bdc03c481b2c1c6e7fdbd7e0dff404d66600380c78da826a03af\": rpc error: code = NotFound desc = could not find container \"8521c28bbe24bdc03c481b2c1c6e7fdbd7e0dff404d66600380c78da826a03af\": container with ID starting with 8521c28bbe24bdc03c481b2c1c6e7fdbd7e0dff404d66600380c78da826a03af not found: ID does not exist" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.326135 4740 scope.go:117] "RemoveContainer" containerID="73a92a17999d6c21802b190bb9c92d479fbb1bdb6bdf8296aa7261a24648e372" Jan 05 13:51:27 crc kubenswrapper[4740]: E0105 13:51:27.326451 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73a92a17999d6c21802b190bb9c92d479fbb1bdb6bdf8296aa7261a24648e372\": container with ID starting with 73a92a17999d6c21802b190bb9c92d479fbb1bdb6bdf8296aa7261a24648e372 not found: ID does not exist" containerID="73a92a17999d6c21802b190bb9c92d479fbb1bdb6bdf8296aa7261a24648e372" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.326487 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73a92a17999d6c21802b190bb9c92d479fbb1bdb6bdf8296aa7261a24648e372"} err="failed to get container status \"73a92a17999d6c21802b190bb9c92d479fbb1bdb6bdf8296aa7261a24648e372\": rpc error: code = NotFound desc = could not find container \"73a92a17999d6c21802b190bb9c92d479fbb1bdb6bdf8296aa7261a24648e372\": container with ID starting with 73a92a17999d6c21802b190bb9c92d479fbb1bdb6bdf8296aa7261a24648e372 not found: ID does not exist" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.326513 4740 scope.go:117] "RemoveContainer" containerID="cceba3a93ce2fe2e88dfbf0a42bde5732d11950d69f4551b08901b490ff34a98" Jan 05 13:51:27 crc kubenswrapper[4740]: E0105 13:51:27.326755 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cceba3a93ce2fe2e88dfbf0a42bde5732d11950d69f4551b08901b490ff34a98\": container with ID starting with cceba3a93ce2fe2e88dfbf0a42bde5732d11950d69f4551b08901b490ff34a98 not found: ID does not exist" containerID="cceba3a93ce2fe2e88dfbf0a42bde5732d11950d69f4551b08901b490ff34a98" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.326780 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cceba3a93ce2fe2e88dfbf0a42bde5732d11950d69f4551b08901b490ff34a98"} 
err="failed to get container status \"cceba3a93ce2fe2e88dfbf0a42bde5732d11950d69f4551b08901b490ff34a98\": rpc error: code = NotFound desc = could not find container \"cceba3a93ce2fe2e88dfbf0a42bde5732d11950d69f4551b08901b490ff34a98\": container with ID starting with cceba3a93ce2fe2e88dfbf0a42bde5732d11950d69f4551b08901b490ff34a98 not found: ID does not exist" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.326795 4740 scope.go:117] "RemoveContainer" containerID="af635a52f5998f29d57eb30e524277e0de1a44bd16186765a90c89a436d081c8" Jan 05 13:51:27 crc kubenswrapper[4740]: E0105 13:51:27.327311 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af635a52f5998f29d57eb30e524277e0de1a44bd16186765a90c89a436d081c8\": container with ID starting with af635a52f5998f29d57eb30e524277e0de1a44bd16186765a90c89a436d081c8 not found: ID does not exist" containerID="af635a52f5998f29d57eb30e524277e0de1a44bd16186765a90c89a436d081c8" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.327333 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af635a52f5998f29d57eb30e524277e0de1a44bd16186765a90c89a436d081c8"} err="failed to get container status \"af635a52f5998f29d57eb30e524277e0de1a44bd16186765a90c89a436d081c8\": rpc error: code = NotFound desc = could not find container \"af635a52f5998f29d57eb30e524277e0de1a44bd16186765a90c89a436d081c8\": container with ID starting with af635a52f5998f29d57eb30e524277e0de1a44bd16186765a90c89a436d081c8 not found: ID does not exist" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.327350 4740 scope.go:117] "RemoveContainer" containerID="55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70" Jan 05 13:51:27 crc kubenswrapper[4740]: E0105 13:51:27.328042 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70\": container with ID starting with 55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70 not found: ID does not exist" containerID="55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70" Jan 05 13:51:27 crc kubenswrapper[4740]: I0105 13:51:27.328081 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70"} err="failed to get container status \"55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70\": rpc error: code = NotFound desc = could not find container \"55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70\": container with ID starting with 55d1e6303d2f7c553976b4d276565af3aea21727950b610ffb78f1e30319ef70 not found: ID does not exist" Jan 05 13:51:28 crc kubenswrapper[4740]: E0105 13:51:28.054181 4740 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" interval="3.2s" Jan 05 13:51:28 crc kubenswrapper[4740]: E0105 13:51:28.528554 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-05T13:51:28Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-05T13:51:28Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-05T13:51:28Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-05T13:51:28Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:28 crc kubenswrapper[4740]: E0105 13:51:28.529856 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:28 crc kubenswrapper[4740]: E0105 13:51:28.530450 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:28 crc kubenswrapper[4740]: E0105 13:51:28.530830 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:28 crc kubenswrapper[4740]: E0105 13:51:28.531258 4740 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:28 crc kubenswrapper[4740]: E0105 13:51:28.531297 4740 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 05 13:51:28 crc kubenswrapper[4740]: E0105 13:51:28.986362 4740 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.97:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:28 crc kubenswrapper[4740]: I0105 13:51:28.986723 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:29 crc kubenswrapper[4740]: E0105 13:51:29.005147 4740 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.97:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1887da03e39c75b6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-05 13:51:29.004647862 +0000 UTC m=+138.311556481,LastTimestamp:2026-01-05 13:51:29.004647862 +0000 UTC m=+138.311556481,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 05 13:51:29 crc kubenswrapper[4740]: I0105 13:51:29.217621 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"723c3895c69b5dd21f426b47208b0635103896272707f740a4f2b585940946b9"} Jan 05 13:51:30 crc kubenswrapper[4740]: I0105 13:51:30.230406 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"077e6ae7822e64b656eaacca7fb1d0ba1715810d0ff435eb6d93883676c57e44"} Jan 05 13:51:30 crc kubenswrapper[4740]: I0105 13:51:30.230931 4740 status_manager.go:851] "Failed to get status for pod" podUID="14e93303-35a5-401b-b6d0-49931cb458eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:30 crc kubenswrapper[4740]: E0105 13:51:30.231051 4740 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.97:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:30 crc kubenswrapper[4740]: I0105 13:51:30.972844 4740 status_manager.go:851] "Failed to get status for pod" podUID="14e93303-35a5-401b-b6d0-49931cb458eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:31 crc kubenswrapper[4740]: E0105 13:51:31.238534 4740 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.97:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:51:31 crc kubenswrapper[4740]: E0105 13:51:31.255050 4740 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.97:6443: connect: connection refused" interval="6.4s" Jan 05 13:51:34 crc kubenswrapper[4740]: I0105 13:51:34.968231 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:34 crc kubenswrapper[4740]: I0105 13:51:34.969380 4740 status_manager.go:851] "Failed to get status for pod" podUID="14e93303-35a5-401b-b6d0-49931cb458eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:34 crc kubenswrapper[4740]: I0105 13:51:34.988466 4740 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ff7aaeed-7c7e-43d0-bf51-9f92524eae1a" Jan 05 13:51:34 crc kubenswrapper[4740]: I0105 13:51:34.988511 4740 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ff7aaeed-7c7e-43d0-bf51-9f92524eae1a" Jan 05 13:51:34 crc kubenswrapper[4740]: E0105 13:51:34.989011 4740 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.97:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:34 crc kubenswrapper[4740]: I0105 13:51:34.989573 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:35 crc kubenswrapper[4740]: W0105 13:51:35.010107 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-faf4d2d84f3523762e7e53ad31cf359de1c2000d559c61a03f23680879b5bc57 WatchSource:0}: Error finding container faf4d2d84f3523762e7e53ad31cf359de1c2000d559c61a03f23680879b5bc57: Status 404 returned error can't find the container with id faf4d2d84f3523762e7e53ad31cf359de1c2000d559c61a03f23680879b5bc57 Jan 05 13:51:35 crc kubenswrapper[4740]: I0105 13:51:35.269995 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ecd233231393f099e8592c3f80bb6a1af5348bd0eac560ca77eb5174f7d58597"} Jan 05 13:51:35 crc kubenswrapper[4740]: I0105 13:51:35.270379 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"faf4d2d84f3523762e7e53ad31cf359de1c2000d559c61a03f23680879b5bc57"} Jan 05 13:51:35 crc kubenswrapper[4740]: I0105 13:51:35.270759 4740 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ff7aaeed-7c7e-43d0-bf51-9f92524eae1a" Jan 05 13:51:35 crc kubenswrapper[4740]: I0105 13:51:35.270783 4740 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ff7aaeed-7c7e-43d0-bf51-9f92524eae1a" Jan 05 13:51:35 crc kubenswrapper[4740]: E0105 13:51:35.271224 4740 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.97:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:35 crc kubenswrapper[4740]: I0105 13:51:35.271223 4740 status_manager.go:851] "Failed to get status for pod" podUID="14e93303-35a5-401b-b6d0-49931cb458eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:36 crc kubenswrapper[4740]: I0105 13:51:36.281841 4740 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="ecd233231393f099e8592c3f80bb6a1af5348bd0eac560ca77eb5174f7d58597" exitCode=0 Jan 05 13:51:36 crc kubenswrapper[4740]: I0105 13:51:36.281925 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"ecd233231393f099e8592c3f80bb6a1af5348bd0eac560ca77eb5174f7d58597"} Jan 05 13:51:36 crc kubenswrapper[4740]: I0105 13:51:36.282369 4740 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ff7aaeed-7c7e-43d0-bf51-9f92524eae1a" Jan 05 13:51:36 crc kubenswrapper[4740]: I0105 13:51:36.282392 4740 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ff7aaeed-7c7e-43d0-bf51-9f92524eae1a" Jan 05 13:51:36 crc kubenswrapper[4740]: I0105 13:51:36.282974 4740 status_manager.go:851] "Failed to get status for pod" podUID="14e93303-35a5-401b-b6d0-49931cb458eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.97:6443: connect: connection refused" Jan 05 13:51:36 crc kubenswrapper[4740]: E0105 13:51:36.282983 4740 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.97:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:36 crc kubenswrapper[4740]: E0105 13:51:36.500799 4740 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.97:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1887da03e39c75b6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-05 13:51:29.004647862 +0000 UTC m=+138.311556481,LastTimestamp:2026-01-05 13:51:29.004647862 +0000 UTC m=+138.311556481,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 05 13:51:37 crc kubenswrapper[4740]: I0105 13:51:37.291974 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"28cda2e61e592e473c7ea0753b3d903b9a8f500ff88c03039ef0383ac6bcc261"} Jan 05 13:51:37 crc kubenswrapper[4740]: I0105 13:51:37.292335 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"09c33b4513d752de81f1175a0bad79489b5d7fbccd650ab692e06d870ee466aa"} Jan 05 13:51:37 crc kubenswrapper[4740]: I0105 13:51:37.292349 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c5099a4c41ae83065f0806762d0efa66968e3d4c2fe83cf4dde8767584a600d6"} Jan 05 13:51:38 crc kubenswrapper[4740]: I0105 13:51:38.300429 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"73dec76c59281f3fb619349e7dcb6d33cecd475c2984732779e18a22454d1f1b"} Jan 05 13:51:38 crc kubenswrapper[4740]: I0105 13:51:38.300477 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ec94b4a429fcf9797259eb84ab83e854d2f613d89fcfdd79f694924419604ff8"} Jan 05 13:51:38 crc kubenswrapper[4740]: I0105 13:51:38.300643 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:38 crc kubenswrapper[4740]: I0105 13:51:38.300748 4740 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ff7aaeed-7c7e-43d0-bf51-9f92524eae1a" Jan 05 13:51:38 crc kubenswrapper[4740]: I0105 13:51:38.300775 4740 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ff7aaeed-7c7e-43d0-bf51-9f92524eae1a" Jan 05 13:51:39 crc kubenswrapper[4740]: I0105 13:51:39.309125 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 05 13:51:39 crc kubenswrapper[4740]: I0105 13:51:39.309172 4740 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="72eeec19f8ced0f5b2113145af58085f9ec9942ac6f7d1d106f740f869a8b70a" exitCode=1 Jan 05 13:51:39 crc kubenswrapper[4740]: I0105 13:51:39.309200 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"72eeec19f8ced0f5b2113145af58085f9ec9942ac6f7d1d106f740f869a8b70a"} Jan 05 13:51:39 crc kubenswrapper[4740]: I0105 13:51:39.309591 4740 scope.go:117] "RemoveContainer" containerID="72eeec19f8ced0f5b2113145af58085f9ec9942ac6f7d1d106f740f869a8b70a" Jan 05 13:51:39 crc kubenswrapper[4740]: I0105 13:51:39.990041 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:39 crc kubenswrapper[4740]: I0105 13:51:39.990476 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:39 crc kubenswrapper[4740]: I0105 13:51:39.997224 4740 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:40 crc kubenswrapper[4740]: I0105 13:51:40.321749 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 05 13:51:40 crc kubenswrapper[4740]: I0105 13:51:40.321825 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8b385bd79c82105dd7a18e6a7f2e166f98b30c51233c5eab98decd34d044fe96"} Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.319424 4740 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.322420 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" podUID="8b957b57-a5bc-43d9-acf0-88eb3b539af4" containerName="oauth-openshift" containerID="cri-o://bff4644999fd8d3ffb1be1a6d2f75e99febdd785f719001fd9d2e144b80553d4" gracePeriod=15 Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.379827 4740 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="fb43c882-12e5-4e24-ae34-46bf86275b26" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.391673 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.845757 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.959214 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-session\") pod \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.959289 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b957b57-a5bc-43d9-acf0-88eb3b539af4-audit-dir\") pod \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.959329 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-trusted-ca-bundle\") pod \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.959358 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl88j\" (UniqueName: \"kubernetes.io/projected/8b957b57-a5bc-43d9-acf0-88eb3b539af4-kube-api-access-kl88j\") pod \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.959374 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b957b57-a5bc-43d9-acf0-88eb3b539af4-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "8b957b57-a5bc-43d9-acf0-88eb3b539af4" (UID: "8b957b57-a5bc-43d9-acf0-88eb3b539af4"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.959403 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-cliconfig\") pod \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.959524 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-idp-0-file-data\") pod \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.959614 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-audit-policies\") pod \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.959705 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-service-ca\") pod \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.959783 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-ocp-branding-template\") pod \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.959838 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-login\") pod \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.959895 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-router-certs\") pod \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.959970 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-serving-cert\") pod \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.960020 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-error\") pod \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 
13:51:43.960115 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-provider-selection\") pod \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\" (UID: \"8b957b57-a5bc-43d9-acf0-88eb3b539af4\") " Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.960186 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "8b957b57-a5bc-43d9-acf0-88eb3b539af4" (UID: "8b957b57-a5bc-43d9-acf0-88eb3b539af4"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.960307 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "8b957b57-a5bc-43d9-acf0-88eb3b539af4" (UID: "8b957b57-a5bc-43d9-acf0-88eb3b539af4"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.960374 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "8b957b57-a5bc-43d9-acf0-88eb3b539af4" (UID: "8b957b57-a5bc-43d9-acf0-88eb3b539af4"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.960602 4740 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b957b57-a5bc-43d9-acf0-88eb3b539af4-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.960621 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.960634 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.960645 4740 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.961217 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "8b957b57-a5bc-43d9-acf0-88eb3b539af4" (UID: "8b957b57-a5bc-43d9-acf0-88eb3b539af4"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.965853 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "8b957b57-a5bc-43d9-acf0-88eb3b539af4" (UID: "8b957b57-a5bc-43d9-acf0-88eb3b539af4"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.967430 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "8b957b57-a5bc-43d9-acf0-88eb3b539af4" (UID: "8b957b57-a5bc-43d9-acf0-88eb3b539af4"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.969544 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "8b957b57-a5bc-43d9-acf0-88eb3b539af4" (UID: "8b957b57-a5bc-43d9-acf0-88eb3b539af4"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.969570 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b957b57-a5bc-43d9-acf0-88eb3b539af4-kube-api-access-kl88j" (OuterVolumeSpecName: "kube-api-access-kl88j") pod "8b957b57-a5bc-43d9-acf0-88eb3b539af4" (UID: "8b957b57-a5bc-43d9-acf0-88eb3b539af4"). InnerVolumeSpecName "kube-api-access-kl88j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.969854 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "8b957b57-a5bc-43d9-acf0-88eb3b539af4" (UID: "8b957b57-a5bc-43d9-acf0-88eb3b539af4"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.970372 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "8b957b57-a5bc-43d9-acf0-88eb3b539af4" (UID: "8b957b57-a5bc-43d9-acf0-88eb3b539af4"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.970550 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "8b957b57-a5bc-43d9-acf0-88eb3b539af4" (UID: "8b957b57-a5bc-43d9-acf0-88eb3b539af4"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.970944 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "8b957b57-a5bc-43d9-acf0-88eb3b539af4" (UID: "8b957b57-a5bc-43d9-acf0-88eb3b539af4"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:51:43 crc kubenswrapper[4740]: I0105 13:51:43.979510 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "8b957b57-a5bc-43d9-acf0-88eb3b539af4" (UID: "8b957b57-a5bc-43d9-acf0-88eb3b539af4"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.062451 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.062512 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.062542 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl88j\" (UniqueName: \"kubernetes.io/projected/8b957b57-a5bc-43d9-acf0-88eb3b539af4-kube-api-access-kl88j\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.062567 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.062597 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.062623 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.062650 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.062674 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.062698 4740 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.062724 4740 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8b957b57-a5bc-43d9-acf0-88eb3b539af4-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.347530 4740 generic.go:334] "Generic (PLEG): container finished" podID="8b957b57-a5bc-43d9-acf0-88eb3b539af4" containerID="bff4644999fd8d3ffb1be1a6d2f75e99febdd785f719001fd9d2e144b80553d4" exitCode=0 Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.347627 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.347660 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" event={"ID":"8b957b57-a5bc-43d9-acf0-88eb3b539af4","Type":"ContainerDied","Data":"bff4644999fd8d3ffb1be1a6d2f75e99febdd785f719001fd9d2e144b80553d4"} Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.347755 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5msgl" event={"ID":"8b957b57-a5bc-43d9-acf0-88eb3b539af4","Type":"ContainerDied","Data":"89283df55b398c863ed0c898cba5ab290a3f05873ee0faae6152dd0528590d8f"} Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.347771 4740 scope.go:117] "RemoveContainer" containerID="bff4644999fd8d3ffb1be1a6d2f75e99febdd785f719001fd9d2e144b80553d4" Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.348131 4740 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ff7aaeed-7c7e-43d0-bf51-9f92524eae1a" Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.348152 4740 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ff7aaeed-7c7e-43d0-bf51-9f92524eae1a" Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.359428 4740 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="fb43c882-12e5-4e24-ae34-46bf86275b26" Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.365007 4740 scope.go:117] "RemoveContainer" containerID="bff4644999fd8d3ffb1be1a6d2f75e99febdd785f719001fd9d2e144b80553d4" Jan 05 13:51:44 crc kubenswrapper[4740]: E0105 13:51:44.365391 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bff4644999fd8d3ffb1be1a6d2f75e99febdd785f719001fd9d2e144b80553d4\": container with ID starting with bff4644999fd8d3ffb1be1a6d2f75e99febdd785f719001fd9d2e144b80553d4 not found: ID does not exist" containerID="bff4644999fd8d3ffb1be1a6d2f75e99febdd785f719001fd9d2e144b80553d4" Jan 05 13:51:44 crc kubenswrapper[4740]: I0105 13:51:44.365423 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bff4644999fd8d3ffb1be1a6d2f75e99febdd785f719001fd9d2e144b80553d4"} err="failed to get container status \"bff4644999fd8d3ffb1be1a6d2f75e99febdd785f719001fd9d2e144b80553d4\": rpc error: code = NotFound desc = could not find container 
\"bff4644999fd8d3ffb1be1a6d2f75e99febdd785f719001fd9d2e144b80553d4\": container with ID starting with bff4644999fd8d3ffb1be1a6d2f75e99febdd785f719001fd9d2e144b80553d4 not found: ID does not exist" Jan 05 13:51:44 crc kubenswrapper[4740]: E0105 13:51:44.521706 4740 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Jan 05 13:51:47 crc kubenswrapper[4740]: I0105 13:51:47.808056 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:51:47 crc kubenswrapper[4740]: I0105 13:51:47.819245 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:51:52 crc kubenswrapper[4740]: I0105 13:51:52.779586 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 05 13:51:53 crc kubenswrapper[4740]: I0105 13:51:53.383855 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 05 13:51:53 crc kubenswrapper[4740]: I0105 13:51:53.398404 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 05 13:51:53 crc kubenswrapper[4740]: I0105 13:51:53.814274 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 05 13:51:54 crc kubenswrapper[4740]: I0105 13:51:54.268621 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 05 13:51:54 crc kubenswrapper[4740]: I0105 13:51:54.496371 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 05 13:51:54 crc kubenswrapper[4740]: I0105 13:51:54.565751 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 05 13:51:55 crc kubenswrapper[4740]: I0105 13:51:55.219158 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 05 13:51:55 crc kubenswrapper[4740]: I0105 13:51:55.636882 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 05 13:51:55 crc kubenswrapper[4740]: I0105 13:51:55.937409 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 05 13:51:56 crc kubenswrapper[4740]: I0105 13:51:56.179401 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 05 13:51:56 crc kubenswrapper[4740]: I0105 13:51:56.225628 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 05 13:51:56 crc kubenswrapper[4740]: I0105 13:51:56.319491 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 05 13:51:56 crc kubenswrapper[4740]: I0105 13:51:56.326415 4740 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 05 13:51:56 crc kubenswrapper[4740]: I0105 13:51:56.355971 4740 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 05 13:51:56 crc kubenswrapper[4740]: I0105 13:51:56.456405 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 05 13:51:56 crc kubenswrapper[4740]: I0105 13:51:56.517760 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 05 13:51:56 crc kubenswrapper[4740]: I0105 13:51:56.540740 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 05 13:51:56 crc kubenswrapper[4740]: I0105 13:51:56.548479 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 05 13:51:56 crc kubenswrapper[4740]: I0105 13:51:56.580945 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 05 13:51:56 crc kubenswrapper[4740]: I0105 13:51:56.622997 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 05 13:51:56 crc kubenswrapper[4740]: I0105 13:51:56.891204 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 05 13:51:57 crc kubenswrapper[4740]: I0105 13:51:57.093873 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 05 13:51:57 crc kubenswrapper[4740]: I0105 13:51:57.109557 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 05 13:51:57 crc kubenswrapper[4740]: I0105 13:51:57.205572 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 05 13:51:57 crc kubenswrapper[4740]: I0105 13:51:57.275301 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 05 13:51:57 crc kubenswrapper[4740]: I0105 13:51:57.491514 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 05 13:51:57 crc kubenswrapper[4740]: I0105 13:51:57.689177 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 05 13:51:57 crc kubenswrapper[4740]: I0105 13:51:57.781447 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 05 13:51:57 crc kubenswrapper[4740]: I0105 13:51:57.801089 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 05 13:51:57 crc kubenswrapper[4740]: I0105 13:51:57.895326 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 05 13:51:57 crc kubenswrapper[4740]: I0105 13:51:57.897558 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 05 13:51:57 crc kubenswrapper[4740]: I0105 13:51:57.922476 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 05 13:51:57 crc kubenswrapper[4740]: I0105 13:51:57.996134 4740 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.018533 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.039943 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.057820 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.087835 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.134820 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.234523 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.236079 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.391775 4740 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.403136 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.423957 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.641008 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.641763 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.713552 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.736908 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.799258 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.890899 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.893298 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 05 13:51:58 crc kubenswrapper[4740]: I0105 13:51:58.913193 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 05 13:51:59 crc kubenswrapper[4740]: I0105 13:51:59.058623 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" 
Jan 05 13:51:59 crc kubenswrapper[4740]: I0105 13:51:59.168267 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 05 13:51:59 crc kubenswrapper[4740]: I0105 13:51:59.356297 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 05 13:51:59 crc kubenswrapper[4740]: I0105 13:51:59.414780 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 05 13:51:59 crc kubenswrapper[4740]: I0105 13:51:59.431571 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 05 13:51:59 crc kubenswrapper[4740]: I0105 13:51:59.431578 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 05 13:51:59 crc kubenswrapper[4740]: I0105 13:51:59.544451 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 05 13:51:59 crc kubenswrapper[4740]: I0105 13:51:59.546679 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 05 13:51:59 crc kubenswrapper[4740]: I0105 13:51:59.574055 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 05 13:51:59 crc kubenswrapper[4740]: I0105 13:51:59.770594 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 05 13:51:59 crc kubenswrapper[4740]: I0105 13:51:59.865220 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 05 13:51:59 crc kubenswrapper[4740]: I0105 13:51:59.918830 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 05 13:51:59 crc kubenswrapper[4740]: I0105 13:51:59.933606 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.028775 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.051924 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.073194 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.126442 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.267465 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.315292 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.397873 4740 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.454625 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.497253 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.526542 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.542402 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.551081 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.562957 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.641443 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.778846 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.809450 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.852689 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.869956 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 05 13:52:00 crc kubenswrapper[4740]: I0105 13:52:00.901887 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.029865 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.078155 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.152670 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.201293 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.206129 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.223482 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.269326 4740 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.286318 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.365116 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.431378 4740 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.458944 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.583694 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.639567 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.650361 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.742652 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.784249 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.784324 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.807196 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.839609 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.916771 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.917387 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 13:52:01 crc kubenswrapper[4740]: I0105 13:52:01.930926 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 05 13:52:02 crc kubenswrapper[4740]: I0105 13:52:02.057436 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 05 13:52:02 crc kubenswrapper[4740]: I0105 13:52:02.187388 4740 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 05 13:52:02 crc kubenswrapper[4740]: I0105 13:52:02.197127 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 05 13:52:02 crc kubenswrapper[4740]: I0105 13:52:02.322419 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 05 13:52:02 crc kubenswrapper[4740]: I0105 13:52:02.378951 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 05 13:52:02 crc kubenswrapper[4740]: I0105 13:52:02.476347 4740 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 05 13:52:02 crc kubenswrapper[4740]: I0105 13:52:02.523142 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 05 13:52:02 crc kubenswrapper[4740]: I0105 13:52:02.527675 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 05 13:52:02 crc kubenswrapper[4740]: I0105 13:52:02.543595 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 05 13:52:02 crc kubenswrapper[4740]: I0105 13:52:02.624936 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 05 13:52:02 crc kubenswrapper[4740]: I0105 13:52:02.711410 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 05 13:52:02 crc kubenswrapper[4740]: I0105 13:52:02.765816 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 05 13:52:02 crc kubenswrapper[4740]: I0105 13:52:02.843208 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 05 13:52:02 crc kubenswrapper[4740]: I0105 13:52:02.914512 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 05 13:52:02 crc kubenswrapper[4740]: I0105 13:52:02.920230 4740 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.132428 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.145611 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.160637 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.166499 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.168472 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.262616 4740 reflector.go:368] Caches populated for *v1.Pod from 
pkg/kubelet/config/apiserver.go:66 Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.264021 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.266267 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5msgl","openshift-kube-apiserver/kube-apiserver-crc"] Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.266327 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-68b95d957b-fwtrs"] Jan 05 13:52:03 crc kubenswrapper[4740]: E0105 13:52:03.266479 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14e93303-35a5-401b-b6d0-49931cb458eb" containerName="installer" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.266490 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="14e93303-35a5-401b-b6d0-49931cb458eb" containerName="installer" Jan 05 13:52:03 crc kubenswrapper[4740]: E0105 13:52:03.266500 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b957b57-a5bc-43d9-acf0-88eb3b539af4" containerName="oauth-openshift" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.266506 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b957b57-a5bc-43d9-acf0-88eb3b539af4" containerName="oauth-openshift" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.266576 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b957b57-a5bc-43d9-acf0-88eb3b539af4" containerName="oauth-openshift" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.266590 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="14e93303-35a5-401b-b6d0-49931cb458eb" containerName="installer" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.266900 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.294129 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.294473 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.295952 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.305415 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.306342 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.306756 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.307189 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.307380 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.306778 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.307276 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.307212 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.307319 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.310642 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.316335 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.320863 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.323285 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.336060 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.362448 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.362426581 
podStartE2EDuration="20.362426581s" podCreationTimestamp="2026-01-05 13:51:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:52:03.341441099 +0000 UTC m=+172.648349678" watchObservedRunningTime="2026-01-05 13:52:03.362426581 +0000 UTC m=+172.669335180" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.383342 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.417261 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.417325 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.417396 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-user-template-login\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.417467 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-audit-policies\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.417502 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-audit-dir\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.417541 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-router-certs\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.417566 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.417594 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-user-template-error\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.417714 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2sxp\" (UniqueName: \"kubernetes.io/projected/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-kube-api-access-g2sxp\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.417786 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.417820 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-session\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.417874 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.417917 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.417989 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-service-ca\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 
13:52:03.518576 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.520199 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.520239 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.520276 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-user-template-login\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.520309 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-audit-policies\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.520331 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-audit-dir\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.520354 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-router-certs\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.520379 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.520691 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-user-template-error\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: 
\"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.520729 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2sxp\" (UniqueName: \"kubernetes.io/projected/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-kube-api-access-g2sxp\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.520753 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.520781 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-session\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.520807 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.520828 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.520861 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-service-ca\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.521331 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-audit-dir\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.521985 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-audit-policies\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " 
pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.522302 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.522824 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-service-ca\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.523228 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.526225 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-session\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.527723 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.528135 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-user-template-error\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.528476 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.528787 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-router-certs\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " 
pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.529184 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.536507 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-user-template-login\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.536539 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.540737 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2sxp\" (UniqueName: \"kubernetes.io/projected/4e0355b5-87f3-4eeb-b13f-8287d9cb0786-kube-api-access-g2sxp\") pod \"oauth-openshift-68b95d957b-fwtrs\" (UID: \"4e0355b5-87f3-4eeb-b13f-8287d9cb0786\") " pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.580599 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.583220 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.630741 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.751549 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.805785 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.807461 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.867831 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.977324 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 05 13:52:03 crc kubenswrapper[4740]: I0105 13:52:03.979401 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.003585 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.048691 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.063995 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.079562 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68b95d957b-fwtrs"] Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.166403 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.222463 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.236690 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.308029 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.334178 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.339808 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.468271 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" event={"ID":"4e0355b5-87f3-4eeb-b13f-8287d9cb0786","Type":"ContainerStarted","Data":"6692bd47e06778460ebb8929e436bcb7d4f792d4c92347cc8b2fea6fed35b94d"} Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.468340 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" 
event={"ID":"4e0355b5-87f3-4eeb-b13f-8287d9cb0786","Type":"ContainerStarted","Data":"9649a52a13bab4ada2ba4cbc7f0616f408f79ed9c5d254bbc6cdf5bca9dcc54e"} Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.470003 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.491041 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.496361 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" podStartSLOduration=46.496335308 podStartE2EDuration="46.496335308s" podCreationTimestamp="2026-01-05 13:51:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:52:04.489329004 +0000 UTC m=+173.796237643" watchObservedRunningTime="2026-01-05 13:52:04.496335308 +0000 UTC m=+173.803243917" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.559144 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.599030 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.626970 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.636213 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.643408 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.661483 4740 patch_prober.go:28] interesting pod/oauth-openshift-68b95d957b-fwtrs container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.62:6443/healthz\": read tcp 10.217.0.2:39374->10.217.0.62:6443: read: connection reset by peer" start-of-body= Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.661531 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" podUID="4e0355b5-87f3-4eeb-b13f-8287d9cb0786" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.62:6443/healthz\": read tcp 10.217.0.2:39374->10.217.0.62:6443: read: connection reset by peer" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.666187 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.686868 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.690361 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.690493 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 05 13:52:04 crc 
kubenswrapper[4740]: I0105 13:52:04.723908 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.758541 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.781051 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.800228 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.801853 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.803572 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.885621 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.907916 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.940154 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 05 13:52:04 crc kubenswrapper[4740]: I0105 13:52:04.978341 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b957b57-a5bc-43d9-acf0-88eb3b539af4" path="/var/lib/kubelet/pods/8b957b57-a5bc-43d9-acf0-88eb3b539af4/volumes" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.031363 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.035232 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.046782 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.060707 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.108586 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.165575 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.208161 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.224776 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.337441 4740 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.362706 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.410797 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.454235 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.455096 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.473190 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-68b95d957b-fwtrs_4e0355b5-87f3-4eeb-b13f-8287d9cb0786/oauth-openshift/0.log" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.473238 4740 generic.go:334] "Generic (PLEG): container finished" podID="4e0355b5-87f3-4eeb-b13f-8287d9cb0786" containerID="6692bd47e06778460ebb8929e436bcb7d4f792d4c92347cc8b2fea6fed35b94d" exitCode=255 Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.473334 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" event={"ID":"4e0355b5-87f3-4eeb-b13f-8287d9cb0786","Type":"ContainerDied","Data":"6692bd47e06778460ebb8929e436bcb7d4f792d4c92347cc8b2fea6fed35b94d"} Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.473735 4740 scope.go:117] "RemoveContainer" containerID="6692bd47e06778460ebb8929e436bcb7d4f792d4c92347cc8b2fea6fed35b94d" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.483852 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.552395 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.712401 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.736445 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.992490 4740 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 05 13:52:05 crc kubenswrapper[4740]: I0105 13:52:05.992932 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://077e6ae7822e64b656eaacca7fb1d0ba1715810d0ff435eb6d93883676c57e44" gracePeriod=5 Jan 05 13:52:06 crc kubenswrapper[4740]: I0105 13:52:06.175523 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 05 13:52:06 crc kubenswrapper[4740]: I0105 13:52:06.234117 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 05 13:52:06 crc kubenswrapper[4740]: 
I0105 13:52:06.242563 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 05 13:52:06 crc kubenswrapper[4740]: I0105 13:52:06.277155 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 05 13:52:06 crc kubenswrapper[4740]: I0105 13:52:06.279990 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 05 13:52:06 crc kubenswrapper[4740]: I0105 13:52:06.392602 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 05 13:52:06 crc kubenswrapper[4740]: I0105 13:52:06.484702 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-68b95d957b-fwtrs_4e0355b5-87f3-4eeb-b13f-8287d9cb0786/oauth-openshift/1.log" Jan 05 13:52:06 crc kubenswrapper[4740]: I0105 13:52:06.485408 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-68b95d957b-fwtrs_4e0355b5-87f3-4eeb-b13f-8287d9cb0786/oauth-openshift/0.log" Jan 05 13:52:06 crc kubenswrapper[4740]: I0105 13:52:06.485465 4740 generic.go:334] "Generic (PLEG): container finished" podID="4e0355b5-87f3-4eeb-b13f-8287d9cb0786" containerID="03f2329523718d7862f45df79cf862352eff9e792d9973b2e22278b97dbb9215" exitCode=255 Jan 05 13:52:06 crc kubenswrapper[4740]: I0105 13:52:06.485504 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" event={"ID":"4e0355b5-87f3-4eeb-b13f-8287d9cb0786","Type":"ContainerDied","Data":"03f2329523718d7862f45df79cf862352eff9e792d9973b2e22278b97dbb9215"} Jan 05 13:52:06 crc kubenswrapper[4740]: I0105 13:52:06.485547 4740 scope.go:117] "RemoveContainer" containerID="6692bd47e06778460ebb8929e436bcb7d4f792d4c92347cc8b2fea6fed35b94d" Jan 05 13:52:06 crc kubenswrapper[4740]: I0105 13:52:06.486425 4740 scope.go:117] "RemoveContainer" containerID="03f2329523718d7862f45df79cf862352eff9e792d9973b2e22278b97dbb9215" Jan 05 13:52:06 crc kubenswrapper[4740]: E0105 13:52:06.486701 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-68b95d957b-fwtrs_openshift-authentication(4e0355b5-87f3-4eeb-b13f-8287d9cb0786)\"" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" podUID="4e0355b5-87f3-4eeb-b13f-8287d9cb0786" Jan 05 13:52:06 crc kubenswrapper[4740]: I0105 13:52:06.759136 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 05 13:52:06 crc kubenswrapper[4740]: I0105 13:52:06.760221 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 05 13:52:06 crc kubenswrapper[4740]: I0105 13:52:06.956504 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 05 13:52:07 crc kubenswrapper[4740]: I0105 13:52:07.058018 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 05 13:52:07 crc kubenswrapper[4740]: I0105 13:52:07.090563 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" 
Jan 05 13:52:07 crc kubenswrapper[4740]: I0105 13:52:07.124189 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 05 13:52:07 crc kubenswrapper[4740]: I0105 13:52:07.197499 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 05 13:52:07 crc kubenswrapper[4740]: I0105 13:52:07.318036 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 05 13:52:07 crc kubenswrapper[4740]: I0105 13:52:07.380382 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 05 13:52:07 crc kubenswrapper[4740]: I0105 13:52:07.393929 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 05 13:52:07 crc kubenswrapper[4740]: I0105 13:52:07.493418 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-68b95d957b-fwtrs_4e0355b5-87f3-4eeb-b13f-8287d9cb0786/oauth-openshift/1.log" Jan 05 13:52:07 crc kubenswrapper[4740]: I0105 13:52:07.493928 4740 scope.go:117] "RemoveContainer" containerID="03f2329523718d7862f45df79cf862352eff9e792d9973b2e22278b97dbb9215" Jan 05 13:52:07 crc kubenswrapper[4740]: E0105 13:52:07.494278 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-68b95d957b-fwtrs_openshift-authentication(4e0355b5-87f3-4eeb-b13f-8287d9cb0786)\"" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" podUID="4e0355b5-87f3-4eeb-b13f-8287d9cb0786" Jan 05 13:52:07 crc kubenswrapper[4740]: I0105 13:52:07.599519 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 05 13:52:07 crc kubenswrapper[4740]: I0105 13:52:07.707684 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 05 13:52:07 crc kubenswrapper[4740]: I0105 13:52:07.820828 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 05 13:52:07 crc kubenswrapper[4740]: I0105 13:52:07.824912 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 05 13:52:07 crc kubenswrapper[4740]: I0105 13:52:07.843456 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 05 13:52:08 crc kubenswrapper[4740]: I0105 13:52:08.034557 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 05 13:52:08 crc kubenswrapper[4740]: I0105 13:52:08.073789 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 05 13:52:08 crc kubenswrapper[4740]: I0105 13:52:08.088839 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 05 13:52:08 crc kubenswrapper[4740]: I0105 13:52:08.317509 4740 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 05 13:52:08 crc kubenswrapper[4740]: I0105 13:52:08.430167 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 05 13:52:08 crc kubenswrapper[4740]: I0105 13:52:08.449477 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 05 13:52:08 crc kubenswrapper[4740]: I0105 13:52:08.451823 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 05 13:52:08 crc kubenswrapper[4740]: I0105 13:52:08.765566 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 05 13:52:08 crc kubenswrapper[4740]: I0105 13:52:08.773318 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 05 13:52:08 crc kubenswrapper[4740]: I0105 13:52:08.784313 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 05 13:52:08 crc kubenswrapper[4740]: I0105 13:52:08.811632 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 05 13:52:08 crc kubenswrapper[4740]: I0105 13:52:08.911336 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 05 13:52:08 crc kubenswrapper[4740]: I0105 13:52:08.936565 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 05 13:52:08 crc kubenswrapper[4740]: I0105 13:52:08.987734 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 05 13:52:09 crc kubenswrapper[4740]: I0105 13:52:09.009448 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 05 13:52:09 crc kubenswrapper[4740]: I0105 13:52:09.257078 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 05 13:52:09 crc kubenswrapper[4740]: I0105 13:52:09.541885 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 05 13:52:09 crc kubenswrapper[4740]: I0105 13:52:09.547162 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 05 13:52:10 crc kubenswrapper[4740]: I0105 13:52:10.007937 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 05 13:52:10 crc kubenswrapper[4740]: I0105 13:52:10.050138 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 05 13:52:10 crc kubenswrapper[4740]: I0105 13:52:10.051178 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 05 13:52:10 crc kubenswrapper[4740]: I0105 13:52:10.303356 4740 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 05 13:52:10 crc kubenswrapper[4740]: I0105 13:52:10.400916 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 05 13:52:10 crc kubenswrapper[4740]: I0105 13:52:10.417391 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 05 13:52:10 crc kubenswrapper[4740]: I0105 13:52:10.532019 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.207823 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.311348 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.317232 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.342103 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.529158 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.529232 4740 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="077e6ae7822e64b656eaacca7fb1d0ba1715810d0ff435eb6d93883676c57e44" exitCode=137 Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.591886 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.593959 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.594021 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.714480 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.714599 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.714696 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.714771 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.714835 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.714895 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.714914 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.714986 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.715092 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.715341 4740 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.715361 4740 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.715375 4740 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.715386 4740 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.724820 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:52:11 crc kubenswrapper[4740]: I0105 13:52:11.815996 4740 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:12 crc kubenswrapper[4740]: I0105 13:52:12.095953 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 05 13:52:12 crc kubenswrapper[4740]: I0105 13:52:12.540638 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 05 13:52:12 crc kubenswrapper[4740]: I0105 13:52:12.540723 4740 scope.go:117] "RemoveContainer" containerID="077e6ae7822e64b656eaacca7fb1d0ba1715810d0ff435eb6d93883676c57e44" Jan 05 13:52:12 crc kubenswrapper[4740]: I0105 13:52:12.540858 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 05 13:52:12 crc kubenswrapper[4740]: I0105 13:52:12.979641 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 05 13:52:13 crc kubenswrapper[4740]: I0105 13:52:13.187926 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 05 13:52:13 crc kubenswrapper[4740]: I0105 13:52:13.631390 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:13 crc kubenswrapper[4740]: I0105 13:52:13.632136 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:13 crc kubenswrapper[4740]: I0105 13:52:13.632217 4740 scope.go:117] "RemoveContainer" containerID="03f2329523718d7862f45df79cf862352eff9e792d9973b2e22278b97dbb9215" Jan 05 13:52:13 crc kubenswrapper[4740]: E0105 13:52:13.632615 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-68b95d957b-fwtrs_openshift-authentication(4e0355b5-87f3-4eeb-b13f-8287d9cb0786)\"" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" podUID="4e0355b5-87f3-4eeb-b13f-8287d9cb0786" Jan 05 13:52:14 crc kubenswrapper[4740]: I0105 13:52:14.501558 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-547b88d74f-q6php"] Jan 05 13:52:14 crc kubenswrapper[4740]: I0105 13:52:14.501852 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" podUID="f711b0b3-8883-4012-8cbc-ef337e8c074d" containerName="controller-manager" containerID="cri-o://4db5b067ae2fd8a6570ebf9dacfeff842728d969cebda67d5094a1a699e556dc" gracePeriod=30 Jan 05 13:52:14 crc kubenswrapper[4740]: I0105 13:52:14.507622 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7"] Jan 05 13:52:14 crc kubenswrapper[4740]: I0105 13:52:14.507873 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" podUID="ab275941-b160-4fac-9026-dc7476466272" containerName="route-controller-manager" containerID="cri-o://e1ee5ceac69aaaa6373e4eb66d493e9694d21d0bae64ef572cbd742e8f6dbcee" gracePeriod=30 Jan 05 13:52:14 crc kubenswrapper[4740]: I0105 13:52:14.552826 4740 scope.go:117] "RemoveContainer" containerID="03f2329523718d7862f45df79cf862352eff9e792d9973b2e22278b97dbb9215" Jan 05 13:52:14 crc kubenswrapper[4740]: E0105 13:52:14.553001 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-68b95d957b-fwtrs_openshift-authentication(4e0355b5-87f3-4eeb-b13f-8287d9cb0786)\"" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" podUID="4e0355b5-87f3-4eeb-b13f-8287d9cb0786" Jan 05 13:52:14 crc kubenswrapper[4740]: I0105 13:52:14.948411 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:52:14 crc kubenswrapper[4740]: I0105 13:52:14.952844 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgqgk\" (UniqueName: \"kubernetes.io/projected/ab275941-b160-4fac-9026-dc7476466272-kube-api-access-kgqgk\") pod \"ab275941-b160-4fac-9026-dc7476466272\" (UID: \"ab275941-b160-4fac-9026-dc7476466272\") " Jan 05 13:52:14 crc kubenswrapper[4740]: I0105 13:52:14.952886 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab275941-b160-4fac-9026-dc7476466272-serving-cert\") pod \"ab275941-b160-4fac-9026-dc7476466272\" (UID: \"ab275941-b160-4fac-9026-dc7476466272\") " Jan 05 13:52:14 crc kubenswrapper[4740]: I0105 13:52:14.952907 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab275941-b160-4fac-9026-dc7476466272-config\") pod \"ab275941-b160-4fac-9026-dc7476466272\" (UID: \"ab275941-b160-4fac-9026-dc7476466272\") " Jan 05 13:52:14 crc kubenswrapper[4740]: I0105 13:52:14.952959 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab275941-b160-4fac-9026-dc7476466272-client-ca\") pod \"ab275941-b160-4fac-9026-dc7476466272\" (UID: \"ab275941-b160-4fac-9026-dc7476466272\") " Jan 05 13:52:14 crc kubenswrapper[4740]: I0105 13:52:14.953776 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab275941-b160-4fac-9026-dc7476466272-client-ca" (OuterVolumeSpecName: "client-ca") pod "ab275941-b160-4fac-9026-dc7476466272" (UID: "ab275941-b160-4fac-9026-dc7476466272"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:52:14 crc kubenswrapper[4740]: I0105 13:52:14.953964 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab275941-b160-4fac-9026-dc7476466272-config" (OuterVolumeSpecName: "config") pod "ab275941-b160-4fac-9026-dc7476466272" (UID: "ab275941-b160-4fac-9026-dc7476466272"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:52:14 crc kubenswrapper[4740]: I0105 13:52:14.960466 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab275941-b160-4fac-9026-dc7476466272-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ab275941-b160-4fac-9026-dc7476466272" (UID: "ab275941-b160-4fac-9026-dc7476466272"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:52:14 crc kubenswrapper[4740]: I0105 13:52:14.970833 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab275941-b160-4fac-9026-dc7476466272-kube-api-access-kgqgk" (OuterVolumeSpecName: "kube-api-access-kgqgk") pod "ab275941-b160-4fac-9026-dc7476466272" (UID: "ab275941-b160-4fac-9026-dc7476466272"). InnerVolumeSpecName "kube-api-access-kgqgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.030699 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.084008 4740 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab275941-b160-4fac-9026-dc7476466272-client-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.084287 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgqgk\" (UniqueName: \"kubernetes.io/projected/ab275941-b160-4fac-9026-dc7476466272-kube-api-access-kgqgk\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.084310 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab275941-b160-4fac-9026-dc7476466272-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.084346 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab275941-b160-4fac-9026-dc7476466272-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.185157 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f711b0b3-8883-4012-8cbc-ef337e8c074d-serving-cert\") pod \"f711b0b3-8883-4012-8cbc-ef337e8c074d\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.185245 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-client-ca\") pod \"f711b0b3-8883-4012-8cbc-ef337e8c074d\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.185328 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-proxy-ca-bundles\") pod \"f711b0b3-8883-4012-8cbc-ef337e8c074d\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.185365 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-config\") pod \"f711b0b3-8883-4012-8cbc-ef337e8c074d\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.185453 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg6cr\" (UniqueName: \"kubernetes.io/projected/f711b0b3-8883-4012-8cbc-ef337e8c074d-kube-api-access-zg6cr\") pod \"f711b0b3-8883-4012-8cbc-ef337e8c074d\" (UID: \"f711b0b3-8883-4012-8cbc-ef337e8c074d\") " Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.186494 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-client-ca" (OuterVolumeSpecName: "client-ca") pod "f711b0b3-8883-4012-8cbc-ef337e8c074d" (UID: "f711b0b3-8883-4012-8cbc-ef337e8c074d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.186520 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-config" (OuterVolumeSpecName: "config") pod "f711b0b3-8883-4012-8cbc-ef337e8c074d" (UID: "f711b0b3-8883-4012-8cbc-ef337e8c074d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.187112 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f711b0b3-8883-4012-8cbc-ef337e8c074d" (UID: "f711b0b3-8883-4012-8cbc-ef337e8c074d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.188803 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f711b0b3-8883-4012-8cbc-ef337e8c074d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f711b0b3-8883-4012-8cbc-ef337e8c074d" (UID: "f711b0b3-8883-4012-8cbc-ef337e8c074d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.189390 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f711b0b3-8883-4012-8cbc-ef337e8c074d-kube-api-access-zg6cr" (OuterVolumeSpecName: "kube-api-access-zg6cr") pod "f711b0b3-8883-4012-8cbc-ef337e8c074d" (UID: "f711b0b3-8883-4012-8cbc-ef337e8c074d"). InnerVolumeSpecName "kube-api-access-zg6cr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.287090 4740 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.287134 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.287154 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg6cr\" (UniqueName: \"kubernetes.io/projected/f711b0b3-8883-4012-8cbc-ef337e8c074d-kube-api-access-zg6cr\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.287177 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f711b0b3-8883-4012-8cbc-ef337e8c074d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.287195 4740 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f711b0b3-8883-4012-8cbc-ef337e8c074d-client-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.561142 4740 generic.go:334] "Generic (PLEG): container finished" podID="f711b0b3-8883-4012-8cbc-ef337e8c074d" containerID="4db5b067ae2fd8a6570ebf9dacfeff842728d969cebda67d5094a1a699e556dc" exitCode=0 Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.561223 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.561276 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" event={"ID":"f711b0b3-8883-4012-8cbc-ef337e8c074d","Type":"ContainerDied","Data":"4db5b067ae2fd8a6570ebf9dacfeff842728d969cebda67d5094a1a699e556dc"} Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.561351 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-547b88d74f-q6php" event={"ID":"f711b0b3-8883-4012-8cbc-ef337e8c074d","Type":"ContainerDied","Data":"843d76e1346b98c8c870a0e67432509ed4b5e00c2827c7cb65ca9d1eee67711e"} Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.561381 4740 scope.go:117] "RemoveContainer" containerID="4db5b067ae2fd8a6570ebf9dacfeff842728d969cebda67d5094a1a699e556dc" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.565050 4740 generic.go:334] "Generic (PLEG): container finished" podID="ab275941-b160-4fac-9026-dc7476466272" containerID="e1ee5ceac69aaaa6373e4eb66d493e9694d21d0bae64ef572cbd742e8f6dbcee" exitCode=0 Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.565144 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" event={"ID":"ab275941-b160-4fac-9026-dc7476466272","Type":"ContainerDied","Data":"e1ee5ceac69aaaa6373e4eb66d493e9694d21d0bae64ef572cbd742e8f6dbcee"} Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.565198 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" event={"ID":"ab275941-b160-4fac-9026-dc7476466272","Type":"ContainerDied","Data":"6328977b655a260b39d474a94fbef4430707020702c2f07f72522f70671b47c1"} Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.565589 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.587976 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7"] Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.589521 4740 scope.go:117] "RemoveContainer" containerID="4db5b067ae2fd8a6570ebf9dacfeff842728d969cebda67d5094a1a699e556dc" Jan 05 13:52:15 crc kubenswrapper[4740]: E0105 13:52:15.590233 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4db5b067ae2fd8a6570ebf9dacfeff842728d969cebda67d5094a1a699e556dc\": container with ID starting with 4db5b067ae2fd8a6570ebf9dacfeff842728d969cebda67d5094a1a699e556dc not found: ID does not exist" containerID="4db5b067ae2fd8a6570ebf9dacfeff842728d969cebda67d5094a1a699e556dc" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.590279 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db5b067ae2fd8a6570ebf9dacfeff842728d969cebda67d5094a1a699e556dc"} err="failed to get container status \"4db5b067ae2fd8a6570ebf9dacfeff842728d969cebda67d5094a1a699e556dc\": rpc error: code = NotFound desc = could not find container \"4db5b067ae2fd8a6570ebf9dacfeff842728d969cebda67d5094a1a699e556dc\": container with ID starting with 4db5b067ae2fd8a6570ebf9dacfeff842728d969cebda67d5094a1a699e556dc not found: ID does not exist" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.590314 4740 scope.go:117] "RemoveContainer" containerID="e1ee5ceac69aaaa6373e4eb66d493e9694d21d0bae64ef572cbd742e8f6dbcee" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.594552 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d6bb8bd5f-cspx7"] Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.613309 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-547b88d74f-q6php"] Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.618235 4740 scope.go:117] "RemoveContainer" containerID="e1ee5ceac69aaaa6373e4eb66d493e9694d21d0bae64ef572cbd742e8f6dbcee" Jan 05 13:52:15 crc kubenswrapper[4740]: E0105 13:52:15.618865 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1ee5ceac69aaaa6373e4eb66d493e9694d21d0bae64ef572cbd742e8f6dbcee\": container with ID starting with e1ee5ceac69aaaa6373e4eb66d493e9694d21d0bae64ef572cbd742e8f6dbcee not found: ID does not exist" containerID="e1ee5ceac69aaaa6373e4eb66d493e9694d21d0bae64ef572cbd742e8f6dbcee" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.618936 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1ee5ceac69aaaa6373e4eb66d493e9694d21d0bae64ef572cbd742e8f6dbcee"} err="failed to get container status \"e1ee5ceac69aaaa6373e4eb66d493e9694d21d0bae64ef572cbd742e8f6dbcee\": rpc error: code = NotFound desc = could not find container \"e1ee5ceac69aaaa6373e4eb66d493e9694d21d0bae64ef572cbd742e8f6dbcee\": container with ID starting with e1ee5ceac69aaaa6373e4eb66d493e9694d21d0bae64ef572cbd742e8f6dbcee not found: ID does not exist" Jan 05 13:52:15 crc kubenswrapper[4740]: I0105 13:52:15.621886 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-547b88d74f-q6php"] Jan 05 
13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.438314 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch"] Jan 05 13:52:16 crc kubenswrapper[4740]: E0105 13:52:16.438797 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab275941-b160-4fac-9026-dc7476466272" containerName="route-controller-manager" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.438827 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab275941-b160-4fac-9026-dc7476466272" containerName="route-controller-manager" Jan 05 13:52:16 crc kubenswrapper[4740]: E0105 13:52:16.438857 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f711b0b3-8883-4012-8cbc-ef337e8c074d" containerName="controller-manager" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.438873 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f711b0b3-8883-4012-8cbc-ef337e8c074d" containerName="controller-manager" Jan 05 13:52:16 crc kubenswrapper[4740]: E0105 13:52:16.438913 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.438929 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.439195 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab275941-b160-4fac-9026-dc7476466272" containerName="route-controller-manager" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.439226 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.439258 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f711b0b3-8883-4012-8cbc-ef337e8c074d" containerName="controller-manager" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.440100 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.442114 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7"] Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.443160 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.447960 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.452006 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.456387 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.457549 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.460400 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.463677 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.467495 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.471729 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.483105 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.487438 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.487544 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.488994 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.495704 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.506316 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7"] Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.513246 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch"] Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.606307 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a31ea5a-c063-4e21-b89f-7330a840305f-client-ca\") pod \"route-controller-manager-bf4f87789-69bk7\" (UID: \"9a31ea5a-c063-4e21-b89f-7330a840305f\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.606380 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/9a31ea5a-c063-4e21-b89f-7330a840305f-config\") pod \"route-controller-manager-bf4f87789-69bk7\" (UID: \"9a31ea5a-c063-4e21-b89f-7330a840305f\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.606427 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-config\") pod \"controller-manager-6cdbbbb5bc-gsgch\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.606479 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lff5d\" (UniqueName: \"kubernetes.io/projected/ae52778a-9848-4a7f-88a6-bc830c60d2df-kube-api-access-lff5d\") pod \"controller-manager-6cdbbbb5bc-gsgch\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.606749 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6njt\" (UniqueName: \"kubernetes.io/projected/9a31ea5a-c063-4e21-b89f-7330a840305f-kube-api-access-c6njt\") pod \"route-controller-manager-bf4f87789-69bk7\" (UID: \"9a31ea5a-c063-4e21-b89f-7330a840305f\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.606913 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-proxy-ca-bundles\") pod \"controller-manager-6cdbbbb5bc-gsgch\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.607139 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae52778a-9848-4a7f-88a6-bc830c60d2df-serving-cert\") pod \"controller-manager-6cdbbbb5bc-gsgch\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.607218 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-client-ca\") pod \"controller-manager-6cdbbbb5bc-gsgch\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.607292 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a31ea5a-c063-4e21-b89f-7330a840305f-serving-cert\") pod \"route-controller-manager-bf4f87789-69bk7\" (UID: \"9a31ea5a-c063-4e21-b89f-7330a840305f\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.709787 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/9a31ea5a-c063-4e21-b89f-7330a840305f-client-ca\") pod \"route-controller-manager-bf4f87789-69bk7\" (UID: \"9a31ea5a-c063-4e21-b89f-7330a840305f\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.709842 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a31ea5a-c063-4e21-b89f-7330a840305f-config\") pod \"route-controller-manager-bf4f87789-69bk7\" (UID: \"9a31ea5a-c063-4e21-b89f-7330a840305f\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.709868 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-config\") pod \"controller-manager-6cdbbbb5bc-gsgch\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.709892 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lff5d\" (UniqueName: \"kubernetes.io/projected/ae52778a-9848-4a7f-88a6-bc830c60d2df-kube-api-access-lff5d\") pod \"controller-manager-6cdbbbb5bc-gsgch\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.709987 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6njt\" (UniqueName: \"kubernetes.io/projected/9a31ea5a-c063-4e21-b89f-7330a840305f-kube-api-access-c6njt\") pod \"route-controller-manager-bf4f87789-69bk7\" (UID: \"9a31ea5a-c063-4e21-b89f-7330a840305f\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.710537 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-proxy-ca-bundles\") pod \"controller-manager-6cdbbbb5bc-gsgch\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.710578 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae52778a-9848-4a7f-88a6-bc830c60d2df-serving-cert\") pod \"controller-manager-6cdbbbb5bc-gsgch\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.710641 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-client-ca\") pod \"controller-manager-6cdbbbb5bc-gsgch\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.710868 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a31ea5a-c063-4e21-b89f-7330a840305f-serving-cert\") pod \"route-controller-manager-bf4f87789-69bk7\" (UID: 
\"9a31ea5a-c063-4e21-b89f-7330a840305f\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.711905 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-client-ca\") pod \"controller-manager-6cdbbbb5bc-gsgch\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.713333 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a31ea5a-c063-4e21-b89f-7330a840305f-config\") pod \"route-controller-manager-bf4f87789-69bk7\" (UID: \"9a31ea5a-c063-4e21-b89f-7330a840305f\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.713950 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-proxy-ca-bundles\") pod \"controller-manager-6cdbbbb5bc-gsgch\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.715995 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-config\") pod \"controller-manager-6cdbbbb5bc-gsgch\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.717796 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a31ea5a-c063-4e21-b89f-7330a840305f-client-ca\") pod \"route-controller-manager-bf4f87789-69bk7\" (UID: \"9a31ea5a-c063-4e21-b89f-7330a840305f\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.718374 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a31ea5a-c063-4e21-b89f-7330a840305f-serving-cert\") pod \"route-controller-manager-bf4f87789-69bk7\" (UID: \"9a31ea5a-c063-4e21-b89f-7330a840305f\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.720004 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae52778a-9848-4a7f-88a6-bc830c60d2df-serving-cert\") pod \"controller-manager-6cdbbbb5bc-gsgch\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.741582 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6njt\" (UniqueName: \"kubernetes.io/projected/9a31ea5a-c063-4e21-b89f-7330a840305f-kube-api-access-c6njt\") pod \"route-controller-manager-bf4f87789-69bk7\" (UID: \"9a31ea5a-c063-4e21-b89f-7330a840305f\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.743296 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lff5d\" (UniqueName: \"kubernetes.io/projected/ae52778a-9848-4a7f-88a6-bc830c60d2df-kube-api-access-lff5d\") pod \"controller-manager-6cdbbbb5bc-gsgch\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.795285 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.813638 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.978430 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab275941-b160-4fac-9026-dc7476466272" path="/var/lib/kubelet/pods/ab275941-b160-4fac-9026-dc7476466272/volumes" Jan 05 13:52:16 crc kubenswrapper[4740]: I0105 13:52:16.980666 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f711b0b3-8883-4012-8cbc-ef337e8c074d" path="/var/lib/kubelet/pods/f711b0b3-8883-4012-8cbc-ef337e8c074d/volumes" Jan 05 13:52:17 crc kubenswrapper[4740]: I0105 13:52:17.070319 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7"] Jan 05 13:52:17 crc kubenswrapper[4740]: I0105 13:52:17.282944 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch"] Jan 05 13:52:17 crc kubenswrapper[4740]: W0105 13:52:17.290266 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae52778a_9848_4a7f_88a6_bc830c60d2df.slice/crio-2f24375cc1ff0e445009612f0429dac968c352beb7fb8733c1a4a2417938ad50 WatchSource:0}: Error finding container 2f24375cc1ff0e445009612f0429dac968c352beb7fb8733c1a4a2417938ad50: Status 404 returned error can't find the container with id 2f24375cc1ff0e445009612f0429dac968c352beb7fb8733c1a4a2417938ad50 Jan 05 13:52:17 crc kubenswrapper[4740]: I0105 13:52:17.582997 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" event={"ID":"9a31ea5a-c063-4e21-b89f-7330a840305f","Type":"ContainerStarted","Data":"d731a7c6c7e97d7ab3adc71da4fceb5d0c1a13d77aeb51f829daf3d1bf402195"} Jan 05 13:52:17 crc kubenswrapper[4740]: I0105 13:52:17.583056 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" event={"ID":"9a31ea5a-c063-4e21-b89f-7330a840305f","Type":"ContainerStarted","Data":"6d15022c36c21141082eb648fb55142b6943110eab3f664f24eeda2e5a60aa57"} Jan 05 13:52:17 crc kubenswrapper[4740]: I0105 13:52:17.583091 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:17 crc kubenswrapper[4740]: I0105 13:52:17.586027 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" event={"ID":"ae52778a-9848-4a7f-88a6-bc830c60d2df","Type":"ContainerStarted","Data":"a10d93a629a2169c3fe301562e9272096ed671d86091462dfd30972b0228f51c"} Jan 05 13:52:17 crc kubenswrapper[4740]: I0105 13:52:17.586144 4740 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" event={"ID":"ae52778a-9848-4a7f-88a6-bc830c60d2df","Type":"ContainerStarted","Data":"2f24375cc1ff0e445009612f0429dac968c352beb7fb8733c1a4a2417938ad50"} Jan 05 13:52:17 crc kubenswrapper[4740]: I0105 13:52:17.586323 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:17 crc kubenswrapper[4740]: I0105 13:52:17.600962 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:17 crc kubenswrapper[4740]: I0105 13:52:17.613727 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" podStartSLOduration=3.613709334 podStartE2EDuration="3.613709334s" podCreationTimestamp="2026-01-05 13:52:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:52:17.612295437 +0000 UTC m=+186.919204046" watchObservedRunningTime="2026-01-05 13:52:17.613709334 +0000 UTC m=+186.920617913" Jan 05 13:52:17 crc kubenswrapper[4740]: I0105 13:52:17.637907 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" podStartSLOduration=3.63786902 podStartE2EDuration="3.63786902s" podCreationTimestamp="2026-01-05 13:52:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:52:17.633165176 +0000 UTC m=+186.940073765" watchObservedRunningTime="2026-01-05 13:52:17.63786902 +0000 UTC m=+186.944777609" Jan 05 13:52:17 crc kubenswrapper[4740]: I0105 13:52:17.788993 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:25 crc kubenswrapper[4740]: I0105 13:52:25.968712 4740 scope.go:117] "RemoveContainer" containerID="03f2329523718d7862f45df79cf862352eff9e792d9973b2e22278b97dbb9215" Jan 05 13:52:26 crc kubenswrapper[4740]: I0105 13:52:26.636637 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-68b95d957b-fwtrs_4e0355b5-87f3-4eeb-b13f-8287d9cb0786/oauth-openshift/1.log" Jan 05 13:52:26 crc kubenswrapper[4740]: I0105 13:52:26.636989 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" event={"ID":"4e0355b5-87f3-4eeb-b13f-8287d9cb0786","Type":"ContainerStarted","Data":"fd8427989db694307d5507c031987ac5ec6208cc94bb12244b362b37e23f2382"} Jan 05 13:52:26 crc kubenswrapper[4740]: I0105 13:52:26.637345 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:26 crc kubenswrapper[4740]: I0105 13:52:26.774512 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 13:52:28.300203 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-kdlm6"] Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 13:52:28.302944 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kdlm6" Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 13:52:28.304911 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-kdlm6"] Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 13:52:28.305768 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 13:52:28.306689 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 13:52:28.307054 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 13:52:28.307638 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 13:52:28.307985 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 13:52:28.371225 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/925301ec-8580-4c5b-b7b1-f5e10063789f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-kdlm6\" (UID: \"925301ec-8580-4c5b-b7b1-f5e10063789f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kdlm6" Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 13:52:28.371274 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/925301ec-8580-4c5b-b7b1-f5e10063789f-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-kdlm6\" (UID: \"925301ec-8580-4c5b-b7b1-f5e10063789f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kdlm6" Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 13:52:28.371303 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59zc6\" (UniqueName: \"kubernetes.io/projected/925301ec-8580-4c5b-b7b1-f5e10063789f-kube-api-access-59zc6\") pod \"cluster-monitoring-operator-6d5b84845-kdlm6\" (UID: \"925301ec-8580-4c5b-b7b1-f5e10063789f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kdlm6" Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 13:52:28.472785 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/925301ec-8580-4c5b-b7b1-f5e10063789f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-kdlm6\" (UID: \"925301ec-8580-4c5b-b7b1-f5e10063789f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kdlm6" Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 13:52:28.472843 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/925301ec-8580-4c5b-b7b1-f5e10063789f-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-kdlm6\" (UID: \"925301ec-8580-4c5b-b7b1-f5e10063789f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kdlm6" Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 
13:52:28.472883 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59zc6\" (UniqueName: \"kubernetes.io/projected/925301ec-8580-4c5b-b7b1-f5e10063789f-kube-api-access-59zc6\") pod \"cluster-monitoring-operator-6d5b84845-kdlm6\" (UID: \"925301ec-8580-4c5b-b7b1-f5e10063789f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kdlm6" Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 13:52:28.474149 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/925301ec-8580-4c5b-b7b1-f5e10063789f-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-kdlm6\" (UID: \"925301ec-8580-4c5b-b7b1-f5e10063789f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kdlm6" Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 13:52:28.480617 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/925301ec-8580-4c5b-b7b1-f5e10063789f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-kdlm6\" (UID: \"925301ec-8580-4c5b-b7b1-f5e10063789f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kdlm6" Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 13:52:28.489081 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59zc6\" (UniqueName: \"kubernetes.io/projected/925301ec-8580-4c5b-b7b1-f5e10063789f-kube-api-access-59zc6\") pod \"cluster-monitoring-operator-6d5b84845-kdlm6\" (UID: \"925301ec-8580-4c5b-b7b1-f5e10063789f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kdlm6" Jan 05 13:52:28 crc kubenswrapper[4740]: I0105 13:52:28.660057 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kdlm6" Jan 05 13:52:29 crc kubenswrapper[4740]: I0105 13:52:29.073818 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-kdlm6"] Jan 05 13:52:29 crc kubenswrapper[4740]: W0105 13:52:29.088382 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod925301ec_8580_4c5b_b7b1_f5e10063789f.slice/crio-8d732878a85b05fb548ad11b2fd922c2446085cd02debdb2d11b77daadd482c2 WatchSource:0}: Error finding container 8d732878a85b05fb548ad11b2fd922c2446085cd02debdb2d11b77daadd482c2: Status 404 returned error can't find the container with id 8d732878a85b05fb548ad11b2fd922c2446085cd02debdb2d11b77daadd482c2 Jan 05 13:52:29 crc kubenswrapper[4740]: I0105 13:52:29.656805 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kdlm6" event={"ID":"925301ec-8580-4c5b-b7b1-f5e10063789f","Type":"ContainerStarted","Data":"8d732878a85b05fb548ad11b2fd922c2446085cd02debdb2d11b77daadd482c2"} Jan 05 13:52:31 crc kubenswrapper[4740]: I0105 13:52:31.466837 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc"] Jan 05 13:52:31 crc kubenswrapper[4740]: I0105 13:52:31.467707 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 13:52:31 crc kubenswrapper[4740]: I0105 13:52:31.469743 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Jan 05 13:52:31 crc kubenswrapper[4740]: I0105 13:52:31.469902 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-kshq2" Jan 05 13:52:31 crc kubenswrapper[4740]: I0105 13:52:31.482049 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc"] Jan 05 13:52:31 crc kubenswrapper[4740]: I0105 13:52:31.511319 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-n6tdc\" (UID: \"1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 13:52:31 crc kubenswrapper[4740]: I0105 13:52:31.612553 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-n6tdc\" (UID: \"1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 13:52:31 crc kubenswrapper[4740]: E0105 13:52:31.612672 4740 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Jan 05 13:52:31 crc kubenswrapper[4740]: E0105 13:52:31.612724 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates podName:1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:52:32.112708648 +0000 UTC m=+201.419617227 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-n6tdc" (UID: "1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8") : secret "prometheus-operator-admission-webhook-tls" not found Jan 05 13:52:31 crc kubenswrapper[4740]: I0105 13:52:31.674774 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kdlm6" event={"ID":"925301ec-8580-4c5b-b7b1-f5e10063789f","Type":"ContainerStarted","Data":"a657325655e61c6fe4acbd831bc53744dfcc662a657bd1fb05a9ce7e19c24912"} Jan 05 13:52:31 crc kubenswrapper[4740]: I0105 13:52:31.696215 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kdlm6" podStartSLOduration=1.923277664 podStartE2EDuration="3.696188541s" podCreationTimestamp="2026-01-05 13:52:28 +0000 UTC" firstStartedPulling="2026-01-05 13:52:29.091798966 +0000 UTC m=+198.398707585" lastFinishedPulling="2026-01-05 13:52:30.864709883 +0000 UTC m=+200.171618462" observedRunningTime="2026-01-05 13:52:31.695760319 +0000 UTC m=+201.002668978" watchObservedRunningTime="2026-01-05 13:52:31.696188541 +0000 UTC m=+201.003097160" Jan 05 13:52:31 crc kubenswrapper[4740]: I0105 13:52:31.916578 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 13:52:31 crc kubenswrapper[4740]: I0105 13:52:31.916706 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 13:52:32 crc kubenswrapper[4740]: I0105 13:52:32.119707 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-n6tdc\" (UID: \"1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 13:52:32 crc kubenswrapper[4740]: E0105 13:52:32.119891 4740 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Jan 05 13:52:32 crc kubenswrapper[4740]: E0105 13:52:32.119983 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates podName:1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:52:33.119959248 +0000 UTC m=+202.426867857 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-n6tdc" (UID: "1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8") : secret "prometheus-operator-admission-webhook-tls" not found Jan 05 13:52:33 crc kubenswrapper[4740]: I0105 13:52:33.133293 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-n6tdc\" (UID: \"1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 13:52:33 crc kubenswrapper[4740]: E0105 13:52:33.133530 4740 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Jan 05 13:52:33 crc kubenswrapper[4740]: E0105 13:52:33.133665 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates podName:1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:52:35.133633605 +0000 UTC m=+204.440542214 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-n6tdc" (UID: "1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8") : secret "prometheus-operator-admission-webhook-tls" not found Jan 05 13:52:34 crc kubenswrapper[4740]: I0105 13:52:34.512147 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch"] Jan 05 13:52:34 crc kubenswrapper[4740]: I0105 13:52:34.512737 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" podUID="ae52778a-9848-4a7f-88a6-bc830c60d2df" containerName="controller-manager" containerID="cri-o://a10d93a629a2169c3fe301562e9272096ed671d86091462dfd30972b0228f51c" gracePeriod=30 Jan 05 13:52:34 crc kubenswrapper[4740]: I0105 13:52:34.523978 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7"] Jan 05 13:52:34 crc kubenswrapper[4740]: I0105 13:52:34.524242 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" podUID="9a31ea5a-c063-4e21-b89f-7330a840305f" containerName="route-controller-manager" containerID="cri-o://d731a7c6c7e97d7ab3adc71da4fceb5d0c1a13d77aeb51f829daf3d1bf402195" gracePeriod=30 Jan 05 13:52:34 crc kubenswrapper[4740]: I0105 13:52:34.693944 4740 generic.go:334] "Generic (PLEG): container finished" podID="9a31ea5a-c063-4e21-b89f-7330a840305f" containerID="d731a7c6c7e97d7ab3adc71da4fceb5d0c1a13d77aeb51f829daf3d1bf402195" exitCode=0 Jan 05 13:52:34 crc kubenswrapper[4740]: I0105 13:52:34.693998 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" event={"ID":"9a31ea5a-c063-4e21-b89f-7330a840305f","Type":"ContainerDied","Data":"d731a7c6c7e97d7ab3adc71da4fceb5d0c1a13d77aeb51f829daf3d1bf402195"} Jan 05 13:52:34 crc kubenswrapper[4740]: I0105 
13:52:34.695535 4740 generic.go:334] "Generic (PLEG): container finished" podID="ae52778a-9848-4a7f-88a6-bc830c60d2df" containerID="a10d93a629a2169c3fe301562e9272096ed671d86091462dfd30972b0228f51c" exitCode=0 Jan 05 13:52:34 crc kubenswrapper[4740]: I0105 13:52:34.695565 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" event={"ID":"ae52778a-9848-4a7f-88a6-bc830c60d2df","Type":"ContainerDied","Data":"a10d93a629a2169c3fe301562e9272096ed671d86091462dfd30972b0228f51c"} Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.035970 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.166534 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a31ea5a-c063-4e21-b89f-7330a840305f-serving-cert\") pod \"9a31ea5a-c063-4e21-b89f-7330a840305f\" (UID: \"9a31ea5a-c063-4e21-b89f-7330a840305f\") " Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.166930 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6njt\" (UniqueName: \"kubernetes.io/projected/9a31ea5a-c063-4e21-b89f-7330a840305f-kube-api-access-c6njt\") pod \"9a31ea5a-c063-4e21-b89f-7330a840305f\" (UID: \"9a31ea5a-c063-4e21-b89f-7330a840305f\") " Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.166972 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a31ea5a-c063-4e21-b89f-7330a840305f-config\") pod \"9a31ea5a-c063-4e21-b89f-7330a840305f\" (UID: \"9a31ea5a-c063-4e21-b89f-7330a840305f\") " Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.166999 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a31ea5a-c063-4e21-b89f-7330a840305f-client-ca\") pod \"9a31ea5a-c063-4e21-b89f-7330a840305f\" (UID: \"9a31ea5a-c063-4e21-b89f-7330a840305f\") " Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.167219 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-n6tdc\" (UID: \"1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 13:52:35 crc kubenswrapper[4740]: E0105 13:52:35.167330 4740 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Jan 05 13:52:35 crc kubenswrapper[4740]: E0105 13:52:35.167385 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates podName:1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:52:39.167371934 +0000 UTC m=+208.474280513 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-n6tdc" (UID: "1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8") : secret "prometheus-operator-admission-webhook-tls" not found Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.167796 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a31ea5a-c063-4e21-b89f-7330a840305f-client-ca" (OuterVolumeSpecName: "client-ca") pod "9a31ea5a-c063-4e21-b89f-7330a840305f" (UID: "9a31ea5a-c063-4e21-b89f-7330a840305f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.168111 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a31ea5a-c063-4e21-b89f-7330a840305f-config" (OuterVolumeSpecName: "config") pod "9a31ea5a-c063-4e21-b89f-7330a840305f" (UID: "9a31ea5a-c063-4e21-b89f-7330a840305f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.171761 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a31ea5a-c063-4e21-b89f-7330a840305f-kube-api-access-c6njt" (OuterVolumeSpecName: "kube-api-access-c6njt") pod "9a31ea5a-c063-4e21-b89f-7330a840305f" (UID: "9a31ea5a-c063-4e21-b89f-7330a840305f"). InnerVolumeSpecName "kube-api-access-c6njt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.171767 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a31ea5a-c063-4e21-b89f-7330a840305f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9a31ea5a-c063-4e21-b89f-7330a840305f" (UID: "9a31ea5a-c063-4e21-b89f-7330a840305f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.203581 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.268473 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a31ea5a-c063-4e21-b89f-7330a840305f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.268524 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6njt\" (UniqueName: \"kubernetes.io/projected/9a31ea5a-c063-4e21-b89f-7330a840305f-kube-api-access-c6njt\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.268542 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a31ea5a-c063-4e21-b89f-7330a840305f-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.268560 4740 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a31ea5a-c063-4e21-b89f-7330a840305f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.369649 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-proxy-ca-bundles\") pod \"ae52778a-9848-4a7f-88a6-bc830c60d2df\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.369798 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-config\") pod \"ae52778a-9848-4a7f-88a6-bc830c60d2df\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.369847 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae52778a-9848-4a7f-88a6-bc830c60d2df-serving-cert\") pod \"ae52778a-9848-4a7f-88a6-bc830c60d2df\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.369906 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-client-ca\") pod \"ae52778a-9848-4a7f-88a6-bc830c60d2df\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.370043 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lff5d\" (UniqueName: \"kubernetes.io/projected/ae52778a-9848-4a7f-88a6-bc830c60d2df-kube-api-access-lff5d\") pod \"ae52778a-9848-4a7f-88a6-bc830c60d2df\" (UID: \"ae52778a-9848-4a7f-88a6-bc830c60d2df\") " Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.370679 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-config" (OuterVolumeSpecName: "config") pod "ae52778a-9848-4a7f-88a6-bc830c60d2df" (UID: "ae52778a-9848-4a7f-88a6-bc830c60d2df"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.370827 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-client-ca" (OuterVolumeSpecName: "client-ca") pod "ae52778a-9848-4a7f-88a6-bc830c60d2df" (UID: "ae52778a-9848-4a7f-88a6-bc830c60d2df"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.370950 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ae52778a-9848-4a7f-88a6-bc830c60d2df" (UID: "ae52778a-9848-4a7f-88a6-bc830c60d2df"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.374337 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae52778a-9848-4a7f-88a6-bc830c60d2df-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ae52778a-9848-4a7f-88a6-bc830c60d2df" (UID: "ae52778a-9848-4a7f-88a6-bc830c60d2df"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.374608 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae52778a-9848-4a7f-88a6-bc830c60d2df-kube-api-access-lff5d" (OuterVolumeSpecName: "kube-api-access-lff5d") pod "ae52778a-9848-4a7f-88a6-bc830c60d2df" (UID: "ae52778a-9848-4a7f-88a6-bc830c60d2df"). InnerVolumeSpecName "kube-api-access-lff5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.477943 4740 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.478018 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.478044 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae52778a-9848-4a7f-88a6-bc830c60d2df-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.478095 4740 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae52778a-9848-4a7f-88a6-bc830c60d2df-client-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.478121 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lff5d\" (UniqueName: \"kubernetes.io/projected/ae52778a-9848-4a7f-88a6-bc830c60d2df-kube-api-access-lff5d\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.702532 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" event={"ID":"9a31ea5a-c063-4e21-b89f-7330a840305f","Type":"ContainerDied","Data":"6d15022c36c21141082eb648fb55142b6943110eab3f664f24eeda2e5a60aa57"} Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 
13:52:35.702601 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.702873 4740 scope.go:117] "RemoveContainer" containerID="d731a7c6c7e97d7ab3adc71da4fceb5d0c1a13d77aeb51f829daf3d1bf402195" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.704497 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" event={"ID":"ae52778a-9848-4a7f-88a6-bc830c60d2df","Type":"ContainerDied","Data":"2f24375cc1ff0e445009612f0429dac968c352beb7fb8733c1a4a2417938ad50"} Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.704549 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.725142 4740 scope.go:117] "RemoveContainer" containerID="a10d93a629a2169c3fe301562e9272096ed671d86091462dfd30972b0228f51c" Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.744781 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch"] Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.750155 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6cdbbbb5bc-gsgch"] Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.753224 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7"] Jan 05 13:52:35 crc kubenswrapper[4740]: I0105 13:52:35.755682 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf4f87789-69bk7"] Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.453422 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-58c887dd9d-wq6r9"] Jan 05 13:52:36 crc kubenswrapper[4740]: E0105 13:52:36.453827 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a31ea5a-c063-4e21-b89f-7330a840305f" containerName="route-controller-manager" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.453861 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a31ea5a-c063-4e21-b89f-7330a840305f" containerName="route-controller-manager" Jan 05 13:52:36 crc kubenswrapper[4740]: E0105 13:52:36.453893 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae52778a-9848-4a7f-88a6-bc830c60d2df" containerName="controller-manager" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.453910 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae52778a-9848-4a7f-88a6-bc830c60d2df" containerName="controller-manager" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.454154 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a31ea5a-c063-4e21-b89f-7330a840305f" containerName="route-controller-manager" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.454198 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae52778a-9848-4a7f-88a6-bc830c60d2df" containerName="controller-manager" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.454927 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.459969 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.461111 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.461377 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.461621 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.462432 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.462997 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.469458 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d"] Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.472979 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.475485 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.478380 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.478921 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.479298 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.479610 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.480117 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.480340 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.488371 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d"] Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.491367 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15f088bc-d865-4746-a0ab-bd11b7cdbc67-serving-cert\") pod \"route-controller-manager-68f67d5b6d-lm75d\" (UID: \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\") " 
pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.491498 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-config\") pod \"controller-manager-58c887dd9d-wq6r9\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.491549 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-proxy-ca-bundles\") pod \"controller-manager-58c887dd9d-wq6r9\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.491582 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15f088bc-d865-4746-a0ab-bd11b7cdbc67-config\") pod \"route-controller-manager-68f67d5b6d-lm75d\" (UID: \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\") " pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.491645 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cdsm\" (UniqueName: \"kubernetes.io/projected/15f088bc-d865-4746-a0ab-bd11b7cdbc67-kube-api-access-5cdsm\") pod \"route-controller-manager-68f67d5b6d-lm75d\" (UID: \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\") " pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.491745 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-client-ca\") pod \"controller-manager-58c887dd9d-wq6r9\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.491781 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f00b825-8d7b-406a-9a52-710436e38e99-serving-cert\") pod \"controller-manager-58c887dd9d-wq6r9\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.491826 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15f088bc-d865-4746-a0ab-bd11b7cdbc67-client-ca\") pod \"route-controller-manager-68f67d5b6d-lm75d\" (UID: \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\") " pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.491870 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfsgr\" (UniqueName: \"kubernetes.io/projected/6f00b825-8d7b-406a-9a52-710436e38e99-kube-api-access-nfsgr\") pod \"controller-manager-58c887dd9d-wq6r9\" (UID: 
\"6f00b825-8d7b-406a-9a52-710436e38e99\") " pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.493431 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58c887dd9d-wq6r9"] Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.593600 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-config\") pod \"controller-manager-58c887dd9d-wq6r9\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.593674 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-proxy-ca-bundles\") pod \"controller-manager-58c887dd9d-wq6r9\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.593717 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15f088bc-d865-4746-a0ab-bd11b7cdbc67-config\") pod \"route-controller-manager-68f67d5b6d-lm75d\" (UID: \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\") " pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.593769 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cdsm\" (UniqueName: \"kubernetes.io/projected/15f088bc-d865-4746-a0ab-bd11b7cdbc67-kube-api-access-5cdsm\") pod \"route-controller-manager-68f67d5b6d-lm75d\" (UID: \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\") " pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.593836 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-client-ca\") pod \"controller-manager-58c887dd9d-wq6r9\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.593885 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f00b825-8d7b-406a-9a52-710436e38e99-serving-cert\") pod \"controller-manager-58c887dd9d-wq6r9\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.593954 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15f088bc-d865-4746-a0ab-bd11b7cdbc67-client-ca\") pod \"route-controller-manager-68f67d5b6d-lm75d\" (UID: \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\") " pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.594092 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfsgr\" (UniqueName: \"kubernetes.io/projected/6f00b825-8d7b-406a-9a52-710436e38e99-kube-api-access-nfsgr\") pod 
\"controller-manager-58c887dd9d-wq6r9\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.594199 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15f088bc-d865-4746-a0ab-bd11b7cdbc67-serving-cert\") pod \"route-controller-manager-68f67d5b6d-lm75d\" (UID: \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\") " pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.595405 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-client-ca\") pod \"controller-manager-58c887dd9d-wq6r9\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.595598 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15f088bc-d865-4746-a0ab-bd11b7cdbc67-client-ca\") pod \"route-controller-manager-68f67d5b6d-lm75d\" (UID: \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\") " pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.596518 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-proxy-ca-bundles\") pod \"controller-manager-58c887dd9d-wq6r9\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.596705 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-config\") pod \"controller-manager-58c887dd9d-wq6r9\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.597021 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15f088bc-d865-4746-a0ab-bd11b7cdbc67-config\") pod \"route-controller-manager-68f67d5b6d-lm75d\" (UID: \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\") " pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.600353 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f00b825-8d7b-406a-9a52-710436e38e99-serving-cert\") pod \"controller-manager-58c887dd9d-wq6r9\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.600396 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15f088bc-d865-4746-a0ab-bd11b7cdbc67-serving-cert\") pod \"route-controller-manager-68f67d5b6d-lm75d\" (UID: \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\") " pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.624602 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cdsm\" (UniqueName: \"kubernetes.io/projected/15f088bc-d865-4746-a0ab-bd11b7cdbc67-kube-api-access-5cdsm\") pod \"route-controller-manager-68f67d5b6d-lm75d\" (UID: \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\") " pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.638180 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfsgr\" (UniqueName: \"kubernetes.io/projected/6f00b825-8d7b-406a-9a52-710436e38e99-kube-api-access-nfsgr\") pod \"controller-manager-58c887dd9d-wq6r9\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.790938 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.804000 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.982197 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a31ea5a-c063-4e21-b89f-7330a840305f" path="/var/lib/kubelet/pods/9a31ea5a-c063-4e21-b89f-7330a840305f/volumes" Jan 05 13:52:36 crc kubenswrapper[4740]: I0105 13:52:36.985039 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae52778a-9848-4a7f-88a6-bc830c60d2df" path="/var/lib/kubelet/pods/ae52778a-9848-4a7f-88a6-bc830c60d2df/volumes" Jan 05 13:52:37 crc kubenswrapper[4740]: I0105 13:52:37.128635 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d"] Jan 05 13:52:37 crc kubenswrapper[4740]: W0105 13:52:37.133174 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15f088bc_d865_4746_a0ab_bd11b7cdbc67.slice/crio-d017b69d211b25c203081520a4ea165ebb140b1c679a7554f3187c53977433d7 WatchSource:0}: Error finding container d017b69d211b25c203081520a4ea165ebb140b1c679a7554f3187c53977433d7: Status 404 returned error can't find the container with id d017b69d211b25c203081520a4ea165ebb140b1c679a7554f3187c53977433d7 Jan 05 13:52:37 crc kubenswrapper[4740]: I0105 13:52:37.279450 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58c887dd9d-wq6r9"] Jan 05 13:52:37 crc kubenswrapper[4740]: W0105 13:52:37.286935 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f00b825_8d7b_406a_9a52_710436e38e99.slice/crio-f72fb2b46167338ccbf08499748a0e6a49174e057ea28a50d931ab101467812e WatchSource:0}: Error finding container f72fb2b46167338ccbf08499748a0e6a49174e057ea28a50d931ab101467812e: Status 404 returned error can't find the container with id f72fb2b46167338ccbf08499748a0e6a49174e057ea28a50d931ab101467812e Jan 05 13:52:37 crc kubenswrapper[4740]: I0105 13:52:37.721924 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" 
event={"ID":"15f088bc-d865-4746-a0ab-bd11b7cdbc67","Type":"ContainerStarted","Data":"b2b7bb4af1f562f071ad392ebf4cdcda086c6f2dae3ab3d0985388d220cbd20e"} Jan 05 13:52:37 crc kubenswrapper[4740]: I0105 13:52:37.723011 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" event={"ID":"15f088bc-d865-4746-a0ab-bd11b7cdbc67","Type":"ContainerStarted","Data":"d017b69d211b25c203081520a4ea165ebb140b1c679a7554f3187c53977433d7"} Jan 05 13:52:37 crc kubenswrapper[4740]: I0105 13:52:37.723035 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:37 crc kubenswrapper[4740]: I0105 13:52:37.723126 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" event={"ID":"6f00b825-8d7b-406a-9a52-710436e38e99","Type":"ContainerStarted","Data":"ca2a69e21f5045703893beece1eb892c63dab98834189d898ba79a0cc6735dfa"} Jan 05 13:52:37 crc kubenswrapper[4740]: I0105 13:52:37.723160 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" event={"ID":"6f00b825-8d7b-406a-9a52-710436e38e99","Type":"ContainerStarted","Data":"f72fb2b46167338ccbf08499748a0e6a49174e057ea28a50d931ab101467812e"} Jan 05 13:52:37 crc kubenswrapper[4740]: I0105 13:52:37.723389 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:37 crc kubenswrapper[4740]: I0105 13:52:37.728107 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:52:37 crc kubenswrapper[4740]: I0105 13:52:37.741612 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" podStartSLOduration=3.7415926280000003 podStartE2EDuration="3.741592628s" podCreationTimestamp="2026-01-05 13:52:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:52:37.738277788 +0000 UTC m=+207.045186367" watchObservedRunningTime="2026-01-05 13:52:37.741592628 +0000 UTC m=+207.048501197" Jan 05 13:52:37 crc kubenswrapper[4740]: I0105 13:52:37.765748 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" podStartSLOduration=3.765715755 podStartE2EDuration="3.765715755s" podCreationTimestamp="2026-01-05 13:52:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:52:37.761975233 +0000 UTC m=+207.068883812" watchObservedRunningTime="2026-01-05 13:52:37.765715755 +0000 UTC m=+207.072624324" Jan 05 13:52:37 crc kubenswrapper[4740]: I0105 13:52:37.903554 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:39 crc kubenswrapper[4740]: I0105 13:52:39.230322 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-f54c54754-n6tdc\" (UID: \"1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 13:52:39 crc kubenswrapper[4740]: E0105 13:52:39.230634 4740 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Jan 05 13:52:39 crc kubenswrapper[4740]: E0105 13:52:39.230764 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates podName:1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:52:47.230734771 +0000 UTC m=+216.537643390 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-n6tdc" (UID: "1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8") : secret "prometheus-operator-admission-webhook-tls" not found Jan 05 13:52:47 crc kubenswrapper[4740]: I0105 13:52:47.254676 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-n6tdc\" (UID: \"1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 13:52:47 crc kubenswrapper[4740]: E0105 13:52:47.254924 4740 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Jan 05 13:52:47 crc kubenswrapper[4740]: E0105 13:52:47.255505 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates podName:1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:53:03.255479858 +0000 UTC m=+232.562388477 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-n6tdc" (UID: "1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8") : secret "prometheus-operator-admission-webhook-tls" not found Jan 05 13:52:54 crc kubenswrapper[4740]: I0105 13:52:54.579697 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d"] Jan 05 13:52:54 crc kubenswrapper[4740]: I0105 13:52:54.580293 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" podUID="15f088bc-d865-4746-a0ab-bd11b7cdbc67" containerName="route-controller-manager" containerID="cri-o://b2b7bb4af1f562f071ad392ebf4cdcda086c6f2dae3ab3d0985388d220cbd20e" gracePeriod=30 Jan 05 13:52:54 crc kubenswrapper[4740]: I0105 13:52:54.836881 4740 generic.go:334] "Generic (PLEG): container finished" podID="15f088bc-d865-4746-a0ab-bd11b7cdbc67" containerID="b2b7bb4af1f562f071ad392ebf4cdcda086c6f2dae3ab3d0985388d220cbd20e" exitCode=0 Jan 05 13:52:54 crc kubenswrapper[4740]: I0105 13:52:54.837088 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" event={"ID":"15f088bc-d865-4746-a0ab-bd11b7cdbc67","Type":"ContainerDied","Data":"b2b7bb4af1f562f071ad392ebf4cdcda086c6f2dae3ab3d0985388d220cbd20e"} Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.619952 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.656205 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768"] Jan 05 13:52:55 crc kubenswrapper[4740]: E0105 13:52:55.657184 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f088bc-d865-4746-a0ab-bd11b7cdbc67" containerName="route-controller-manager" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.657216 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f088bc-d865-4746-a0ab-bd11b7cdbc67" containerName="route-controller-manager" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.657398 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="15f088bc-d865-4746-a0ab-bd11b7cdbc67" containerName="route-controller-manager" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.657792 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.664013 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768"] Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.670467 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cdsm\" (UniqueName: \"kubernetes.io/projected/15f088bc-d865-4746-a0ab-bd11b7cdbc67-kube-api-access-5cdsm\") pod \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\" (UID: \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\") " Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.670518 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15f088bc-d865-4746-a0ab-bd11b7cdbc67-serving-cert\") pod \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\" (UID: \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\") " Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.670613 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15f088bc-d865-4746-a0ab-bd11b7cdbc67-config\") pod \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\" (UID: \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\") " Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.670634 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15f088bc-d865-4746-a0ab-bd11b7cdbc67-client-ca\") pod \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\" (UID: \"15f088bc-d865-4746-a0ab-bd11b7cdbc67\") " Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.671620 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15f088bc-d865-4746-a0ab-bd11b7cdbc67-client-ca" (OuterVolumeSpecName: "client-ca") pod "15f088bc-d865-4746-a0ab-bd11b7cdbc67" (UID: "15f088bc-d865-4746-a0ab-bd11b7cdbc67"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.671933 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15f088bc-d865-4746-a0ab-bd11b7cdbc67-config" (OuterVolumeSpecName: "config") pod "15f088bc-d865-4746-a0ab-bd11b7cdbc67" (UID: "15f088bc-d865-4746-a0ab-bd11b7cdbc67"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.677848 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15f088bc-d865-4746-a0ab-bd11b7cdbc67-kube-api-access-5cdsm" (OuterVolumeSpecName: "kube-api-access-5cdsm") pod "15f088bc-d865-4746-a0ab-bd11b7cdbc67" (UID: "15f088bc-d865-4746-a0ab-bd11b7cdbc67"). InnerVolumeSpecName "kube-api-access-5cdsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.678706 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15f088bc-d865-4746-a0ab-bd11b7cdbc67-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "15f088bc-d865-4746-a0ab-bd11b7cdbc67" (UID: "15f088bc-d865-4746-a0ab-bd11b7cdbc67"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.772480 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4272a46-6424-418e-baf7-dba25f1813c4-config\") pod \"route-controller-manager-bf4f87789-2p768\" (UID: \"f4272a46-6424-418e-baf7-dba25f1813c4\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.772524 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtlgq\" (UniqueName: \"kubernetes.io/projected/f4272a46-6424-418e-baf7-dba25f1813c4-kube-api-access-gtlgq\") pod \"route-controller-manager-bf4f87789-2p768\" (UID: \"f4272a46-6424-418e-baf7-dba25f1813c4\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.772670 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4272a46-6424-418e-baf7-dba25f1813c4-client-ca\") pod \"route-controller-manager-bf4f87789-2p768\" (UID: \"f4272a46-6424-418e-baf7-dba25f1813c4\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.772846 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4272a46-6424-418e-baf7-dba25f1813c4-serving-cert\") pod \"route-controller-manager-bf4f87789-2p768\" (UID: \"f4272a46-6424-418e-baf7-dba25f1813c4\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.773034 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15f088bc-d865-4746-a0ab-bd11b7cdbc67-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.773059 4740 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15f088bc-d865-4746-a0ab-bd11b7cdbc67-client-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.773087 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cdsm\" (UniqueName: \"kubernetes.io/projected/15f088bc-d865-4746-a0ab-bd11b7cdbc67-kube-api-access-5cdsm\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.773098 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15f088bc-d865-4746-a0ab-bd11b7cdbc67-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.847822 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" event={"ID":"15f088bc-d865-4746-a0ab-bd11b7cdbc67","Type":"ContainerDied","Data":"d017b69d211b25c203081520a4ea165ebb140b1c679a7554f3187c53977433d7"} Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.847888 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.847898 4740 scope.go:117] "RemoveContainer" containerID="b2b7bb4af1f562f071ad392ebf4cdcda086c6f2dae3ab3d0985388d220cbd20e" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.874480 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4272a46-6424-418e-baf7-dba25f1813c4-config\") pod \"route-controller-manager-bf4f87789-2p768\" (UID: \"f4272a46-6424-418e-baf7-dba25f1813c4\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.874550 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtlgq\" (UniqueName: \"kubernetes.io/projected/f4272a46-6424-418e-baf7-dba25f1813c4-kube-api-access-gtlgq\") pod \"route-controller-manager-bf4f87789-2p768\" (UID: \"f4272a46-6424-418e-baf7-dba25f1813c4\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.874597 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4272a46-6424-418e-baf7-dba25f1813c4-client-ca\") pod \"route-controller-manager-bf4f87789-2p768\" (UID: \"f4272a46-6424-418e-baf7-dba25f1813c4\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.874651 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4272a46-6424-418e-baf7-dba25f1813c4-serving-cert\") pod \"route-controller-manager-bf4f87789-2p768\" (UID: \"f4272a46-6424-418e-baf7-dba25f1813c4\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.876549 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4272a46-6424-418e-baf7-dba25f1813c4-config\") pod \"route-controller-manager-bf4f87789-2p768\" (UID: \"f4272a46-6424-418e-baf7-dba25f1813c4\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.876734 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4272a46-6424-418e-baf7-dba25f1813c4-client-ca\") pod \"route-controller-manager-bf4f87789-2p768\" (UID: \"f4272a46-6424-418e-baf7-dba25f1813c4\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.880962 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4272a46-6424-418e-baf7-dba25f1813c4-serving-cert\") pod \"route-controller-manager-bf4f87789-2p768\" (UID: \"f4272a46-6424-418e-baf7-dba25f1813c4\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.895705 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d"] Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.898690 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtlgq\" (UniqueName: \"kubernetes.io/projected/f4272a46-6424-418e-baf7-dba25f1813c4-kube-api-access-gtlgq\") pod \"route-controller-manager-bf4f87789-2p768\" (UID: \"f4272a46-6424-418e-baf7-dba25f1813c4\") " pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 13:52:55 crc kubenswrapper[4740]: I0105 13:52:55.901977 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68f67d5b6d-lm75d"] Jan 05 13:52:56 crc kubenswrapper[4740]: I0105 13:52:56.009622 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 13:52:56 crc kubenswrapper[4740]: I0105 13:52:56.468624 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768"] Jan 05 13:52:56 crc kubenswrapper[4740]: W0105 13:52:56.478339 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4272a46_6424_418e_baf7_dba25f1813c4.slice/crio-c4fe8cd897eaa4fd3108b5535cae16e7261a98a01bc7fa80c6c94ad5c73d97e7 WatchSource:0}: Error finding container c4fe8cd897eaa4fd3108b5535cae16e7261a98a01bc7fa80c6c94ad5c73d97e7: Status 404 returned error can't find the container with id c4fe8cd897eaa4fd3108b5535cae16e7261a98a01bc7fa80c6c94ad5c73d97e7 Jan 05 13:52:56 crc kubenswrapper[4740]: I0105 13:52:56.858448 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" event={"ID":"f4272a46-6424-418e-baf7-dba25f1813c4","Type":"ContainerStarted","Data":"fe10947fde16ca0dc4ddddad37e7043c438752dd275cb3f058bb1d9132ca7eaf"} Jan 05 13:52:56 crc kubenswrapper[4740]: I0105 13:52:56.858653 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 13:52:56 crc kubenswrapper[4740]: I0105 13:52:56.858669 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" event={"ID":"f4272a46-6424-418e-baf7-dba25f1813c4","Type":"ContainerStarted","Data":"c4fe8cd897eaa4fd3108b5535cae16e7261a98a01bc7fa80c6c94ad5c73d97e7"} Jan 05 13:52:56 crc kubenswrapper[4740]: I0105 13:52:56.894571 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" podStartSLOduration=2.894534747 podStartE2EDuration="2.894534747s" podCreationTimestamp="2026-01-05 13:52:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:52:56.889783977 +0000 UTC m=+226.196692586" watchObservedRunningTime="2026-01-05 13:52:56.894534747 +0000 UTC m=+226.201443376" Jan 05 13:52:56 crc kubenswrapper[4740]: I0105 13:52:56.979380 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15f088bc-d865-4746-a0ab-bd11b7cdbc67" path="/var/lib/kubelet/pods/15f088bc-d865-4746-a0ab-bd11b7cdbc67/volumes" Jan 05 13:52:57 crc kubenswrapper[4740]: I0105 13:52:57.123830 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 13:53:01 crc 
kubenswrapper[4740]: I0105 13:53:01.916196 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 13:53:01 crc kubenswrapper[4740]: I0105 13:53:01.916649 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 13:53:01 crc kubenswrapper[4740]: I0105 13:53:01.916727 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 13:53:01 crc kubenswrapper[4740]: I0105 13:53:01.917661 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce8190df163bf1923ad03250cabf835a3e8f9ecb64484dd6a124c97fc8435ba8"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 13:53:01 crc kubenswrapper[4740]: I0105 13:53:01.917790 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://ce8190df163bf1923ad03250cabf835a3e8f9ecb64484dd6a124c97fc8435ba8" gracePeriod=600 Jan 05 13:53:02 crc kubenswrapper[4740]: I0105 13:53:02.899980 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="ce8190df163bf1923ad03250cabf835a3e8f9ecb64484dd6a124c97fc8435ba8" exitCode=0 Jan 05 13:53:02 crc kubenswrapper[4740]: I0105 13:53:02.900044 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"ce8190df163bf1923ad03250cabf835a3e8f9ecb64484dd6a124c97fc8435ba8"} Jan 05 13:53:02 crc kubenswrapper[4740]: I0105 13:53:02.900679 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"73243fe427b3c563b811bef4fe47899b7220b055a5ab1889d2817322cf522b18"} Jan 05 13:53:03 crc kubenswrapper[4740]: I0105 13:53:03.276849 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-n6tdc\" (UID: \"1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 13:53:03 crc kubenswrapper[4740]: E0105 13:53:03.277137 4740 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Jan 05 13:53:03 crc kubenswrapper[4740]: E0105 13:53:03.277255 4740 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates podName:1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:53:35.277228168 +0000 UTC m=+264.584136777 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-n6tdc" (UID: "1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8") : secret "prometheus-operator-admission-webhook-tls" not found Jan 05 13:53:23 crc kubenswrapper[4740]: I0105 13:53:23.901733 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dwcfs"] Jan 05 13:53:23 crc kubenswrapper[4740]: I0105 13:53:23.907788 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dwcfs" Jan 05 13:53:23 crc kubenswrapper[4740]: I0105 13:53:23.913493 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 05 13:53:23 crc kubenswrapper[4740]: I0105 13:53:23.916615 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dwcfs"] Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.019823 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b023e432-3b4e-4161-bfcc-b5d8b601e9d5-catalog-content\") pod \"certified-operators-dwcfs\" (UID: \"b023e432-3b4e-4161-bfcc-b5d8b601e9d5\") " pod="openshift-marketplace/certified-operators-dwcfs" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.019894 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b023e432-3b4e-4161-bfcc-b5d8b601e9d5-utilities\") pod \"certified-operators-dwcfs\" (UID: \"b023e432-3b4e-4161-bfcc-b5d8b601e9d5\") " pod="openshift-marketplace/certified-operators-dwcfs" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.020126 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jjll\" (UniqueName: \"kubernetes.io/projected/b023e432-3b4e-4161-bfcc-b5d8b601e9d5-kube-api-access-8jjll\") pod \"certified-operators-dwcfs\" (UID: \"b023e432-3b4e-4161-bfcc-b5d8b601e9d5\") " pod="openshift-marketplace/certified-operators-dwcfs" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.098652 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tj7tn"] Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.102141 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tj7tn" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.104579 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.107774 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tj7tn"] Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.121742 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b023e432-3b4e-4161-bfcc-b5d8b601e9d5-catalog-content\") pod \"certified-operators-dwcfs\" (UID: \"b023e432-3b4e-4161-bfcc-b5d8b601e9d5\") " pod="openshift-marketplace/certified-operators-dwcfs" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.121781 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b023e432-3b4e-4161-bfcc-b5d8b601e9d5-utilities\") pod \"certified-operators-dwcfs\" (UID: \"b023e432-3b4e-4161-bfcc-b5d8b601e9d5\") " pod="openshift-marketplace/certified-operators-dwcfs" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.121812 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jjll\" (UniqueName: \"kubernetes.io/projected/b023e432-3b4e-4161-bfcc-b5d8b601e9d5-kube-api-access-8jjll\") pod \"certified-operators-dwcfs\" (UID: \"b023e432-3b4e-4161-bfcc-b5d8b601e9d5\") " pod="openshift-marketplace/certified-operators-dwcfs" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.122721 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b023e432-3b4e-4161-bfcc-b5d8b601e9d5-catalog-content\") pod \"certified-operators-dwcfs\" (UID: \"b023e432-3b4e-4161-bfcc-b5d8b601e9d5\") " pod="openshift-marketplace/certified-operators-dwcfs" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.122762 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b023e432-3b4e-4161-bfcc-b5d8b601e9d5-utilities\") pod \"certified-operators-dwcfs\" (UID: \"b023e432-3b4e-4161-bfcc-b5d8b601e9d5\") " pod="openshift-marketplace/certified-operators-dwcfs" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.140619 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jjll\" (UniqueName: \"kubernetes.io/projected/b023e432-3b4e-4161-bfcc-b5d8b601e9d5-kube-api-access-8jjll\") pod \"certified-operators-dwcfs\" (UID: \"b023e432-3b4e-4161-bfcc-b5d8b601e9d5\") " pod="openshift-marketplace/certified-operators-dwcfs" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.223385 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4df26838-83be-4000-b37e-841a0457717b-catalog-content\") pod \"community-operators-tj7tn\" (UID: \"4df26838-83be-4000-b37e-841a0457717b\") " pod="openshift-marketplace/community-operators-tj7tn" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.223438 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-959gp\" (UniqueName: \"kubernetes.io/projected/4df26838-83be-4000-b37e-841a0457717b-kube-api-access-959gp\") pod \"community-operators-tj7tn\" (UID: 
\"4df26838-83be-4000-b37e-841a0457717b\") " pod="openshift-marketplace/community-operators-tj7tn" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.223505 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4df26838-83be-4000-b37e-841a0457717b-utilities\") pod \"community-operators-tj7tn\" (UID: \"4df26838-83be-4000-b37e-841a0457717b\") " pod="openshift-marketplace/community-operators-tj7tn" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.227229 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dwcfs" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.324914 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4df26838-83be-4000-b37e-841a0457717b-utilities\") pod \"community-operators-tj7tn\" (UID: \"4df26838-83be-4000-b37e-841a0457717b\") " pod="openshift-marketplace/community-operators-tj7tn" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.325001 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4df26838-83be-4000-b37e-841a0457717b-catalog-content\") pod \"community-operators-tj7tn\" (UID: \"4df26838-83be-4000-b37e-841a0457717b\") " pod="openshift-marketplace/community-operators-tj7tn" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.325022 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-959gp\" (UniqueName: \"kubernetes.io/projected/4df26838-83be-4000-b37e-841a0457717b-kube-api-access-959gp\") pod \"community-operators-tj7tn\" (UID: \"4df26838-83be-4000-b37e-841a0457717b\") " pod="openshift-marketplace/community-operators-tj7tn" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.325717 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4df26838-83be-4000-b37e-841a0457717b-utilities\") pod \"community-operators-tj7tn\" (UID: \"4df26838-83be-4000-b37e-841a0457717b\") " pod="openshift-marketplace/community-operators-tj7tn" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.325732 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4df26838-83be-4000-b37e-841a0457717b-catalog-content\") pod \"community-operators-tj7tn\" (UID: \"4df26838-83be-4000-b37e-841a0457717b\") " pod="openshift-marketplace/community-operators-tj7tn" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.342318 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-959gp\" (UniqueName: \"kubernetes.io/projected/4df26838-83be-4000-b37e-841a0457717b-kube-api-access-959gp\") pod \"community-operators-tj7tn\" (UID: \"4df26838-83be-4000-b37e-841a0457717b\") " pod="openshift-marketplace/community-operators-tj7tn" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.421621 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tj7tn" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.624223 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gflkl"] Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.624949 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.641697 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gflkl"] Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.689800 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dwcfs"] Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.731449 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68a83639-e2ec-4762-873c-63dacaf3aab5-registry-tls\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.731508 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68a83639-e2ec-4762-873c-63dacaf3aab5-bound-sa-token\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.731545 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68a83639-e2ec-4762-873c-63dacaf3aab5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.731616 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqzrk\" (UniqueName: \"kubernetes.io/projected/68a83639-e2ec-4762-873c-63dacaf3aab5-kube-api-access-qqzrk\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.731680 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68a83639-e2ec-4762-873c-63dacaf3aab5-registry-certificates\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.731701 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68a83639-e2ec-4762-873c-63dacaf3aab5-trusted-ca\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.731774 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68a83639-e2ec-4762-873c-63dacaf3aab5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.731874 
4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.748949 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.832998 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68a83639-e2ec-4762-873c-63dacaf3aab5-bound-sa-token\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.833382 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68a83639-e2ec-4762-873c-63dacaf3aab5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.833411 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqzrk\" (UniqueName: \"kubernetes.io/projected/68a83639-e2ec-4762-873c-63dacaf3aab5-kube-api-access-qqzrk\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.833436 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68a83639-e2ec-4762-873c-63dacaf3aab5-registry-certificates\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.833460 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68a83639-e2ec-4762-873c-63dacaf3aab5-trusted-ca\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.833500 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68a83639-e2ec-4762-873c-63dacaf3aab5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.833581 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/68a83639-e2ec-4762-873c-63dacaf3aab5-registry-tls\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.833913 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68a83639-e2ec-4762-873c-63dacaf3aab5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.835995 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68a83639-e2ec-4762-873c-63dacaf3aab5-trusted-ca\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.838379 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68a83639-e2ec-4762-873c-63dacaf3aab5-registry-certificates\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.839959 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68a83639-e2ec-4762-873c-63dacaf3aab5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.842186 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68a83639-e2ec-4762-873c-63dacaf3aab5-registry-tls\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.855990 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqzrk\" (UniqueName: \"kubernetes.io/projected/68a83639-e2ec-4762-873c-63dacaf3aab5-kube-api-access-qqzrk\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.864844 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68a83639-e2ec-4762-873c-63dacaf3aab5-bound-sa-token\") pod \"image-registry-66df7c8f76-gflkl\" (UID: \"68a83639-e2ec-4762-873c-63dacaf3aab5\") " pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.869276 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tj7tn"] Jan 05 13:53:24 crc kubenswrapper[4740]: W0105 13:53:24.926531 4740 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4df26838_83be_4000_b37e_841a0457717b.slice/crio-aec5dea9763e1df27ff49a3c7e70fd6d064fab08ba8bf31ac38d10cedededdee WatchSource:0}: Error finding container aec5dea9763e1df27ff49a3c7e70fd6d064fab08ba8bf31ac38d10cedededdee: Status 404 returned error can't find the container with id aec5dea9763e1df27ff49a3c7e70fd6d064fab08ba8bf31ac38d10cedededdee Jan 05 13:53:24 crc kubenswrapper[4740]: I0105 13:53:24.949316 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:25 crc kubenswrapper[4740]: I0105 13:53:25.071295 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj7tn" event={"ID":"4df26838-83be-4000-b37e-841a0457717b","Type":"ContainerStarted","Data":"78968d32172ce76445ea40b297495c9a16f741a674019aa5cfc32cad7ca50369"} Jan 05 13:53:25 crc kubenswrapper[4740]: I0105 13:53:25.071565 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj7tn" event={"ID":"4df26838-83be-4000-b37e-841a0457717b","Type":"ContainerStarted","Data":"aec5dea9763e1df27ff49a3c7e70fd6d064fab08ba8bf31ac38d10cedededdee"} Jan 05 13:53:25 crc kubenswrapper[4740]: I0105 13:53:25.080753 4740 generic.go:334] "Generic (PLEG): container finished" podID="b023e432-3b4e-4161-bfcc-b5d8b601e9d5" containerID="0e54e5d8bc072345b07c5a93f5f25dda434a6930e700dbf72b7d51fa8a66eefd" exitCode=0 Jan 05 13:53:25 crc kubenswrapper[4740]: I0105 13:53:25.080796 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwcfs" event={"ID":"b023e432-3b4e-4161-bfcc-b5d8b601e9d5","Type":"ContainerDied","Data":"0e54e5d8bc072345b07c5a93f5f25dda434a6930e700dbf72b7d51fa8a66eefd"} Jan 05 13:53:25 crc kubenswrapper[4740]: I0105 13:53:25.080821 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwcfs" event={"ID":"b023e432-3b4e-4161-bfcc-b5d8b601e9d5","Type":"ContainerStarted","Data":"a62314f86845ce0576d24cc4045730b899330b4b615c64380522a23b7ab274b6"} Jan 05 13:53:25 crc kubenswrapper[4740]: W0105 13:53:25.182308 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68a83639_e2ec_4762_873c_63dacaf3aab5.slice/crio-f0d67c06f7f582e696693b403e58884cd314216299b58cc15562fc472a622c1b WatchSource:0}: Error finding container f0d67c06f7f582e696693b403e58884cd314216299b58cc15562fc472a622c1b: Status 404 returned error can't find the container with id f0d67c06f7f582e696693b403e58884cd314216299b58cc15562fc472a622c1b Jan 05 13:53:25 crc kubenswrapper[4740]: I0105 13:53:25.183663 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gflkl"] Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.088520 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" event={"ID":"68a83639-e2ec-4762-873c-63dacaf3aab5","Type":"ContainerStarted","Data":"178b122f7452fe6ff6dc76bf9bafb28461e4344dc9019a285f9690a53adda46e"} Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.089252 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" event={"ID":"68a83639-e2ec-4762-873c-63dacaf3aab5","Type":"ContainerStarted","Data":"f0d67c06f7f582e696693b403e58884cd314216299b58cc15562fc472a622c1b"} Jan 
05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.089363 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.090574 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwcfs" event={"ID":"b023e432-3b4e-4161-bfcc-b5d8b601e9d5","Type":"ContainerStarted","Data":"c6ae1e75ece654c80057df4d0d6be852493b5aac17a36fee5c29112edd683c2b"} Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.092149 4740 generic.go:334] "Generic (PLEG): container finished" podID="4df26838-83be-4000-b37e-841a0457717b" containerID="78968d32172ce76445ea40b297495c9a16f741a674019aa5cfc32cad7ca50369" exitCode=0 Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.092192 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj7tn" event={"ID":"4df26838-83be-4000-b37e-841a0457717b","Type":"ContainerDied","Data":"78968d32172ce76445ea40b297495c9a16f741a674019aa5cfc32cad7ca50369"} Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.106804 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gk89z"] Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.109410 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gk89z" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.111031 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.133078 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" podStartSLOduration=2.133043533 podStartE2EDuration="2.133043533s" podCreationTimestamp="2026-01-05 13:53:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:53:26.128092578 +0000 UTC m=+255.435001157" watchObservedRunningTime="2026-01-05 13:53:26.133043533 +0000 UTC m=+255.439952112" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.142651 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gk89z"] Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.275186 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-utilities\") pod \"redhat-marketplace-gk89z\" (UID: \"47adddb7-6c5d-4b97-8502-2fe887d9b8dc\") " pod="openshift-marketplace/redhat-marketplace-gk89z" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.275239 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pbjv\" (UniqueName: \"kubernetes.io/projected/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-kube-api-access-5pbjv\") pod \"redhat-marketplace-gk89z\" (UID: \"47adddb7-6c5d-4b97-8502-2fe887d9b8dc\") " pod="openshift-marketplace/redhat-marketplace-gk89z" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.275267 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-catalog-content\") pod \"redhat-marketplace-gk89z\" (UID: 
\"47adddb7-6c5d-4b97-8502-2fe887d9b8dc\") " pod="openshift-marketplace/redhat-marketplace-gk89z" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.376616 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-utilities\") pod \"redhat-marketplace-gk89z\" (UID: \"47adddb7-6c5d-4b97-8502-2fe887d9b8dc\") " pod="openshift-marketplace/redhat-marketplace-gk89z" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.376659 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pbjv\" (UniqueName: \"kubernetes.io/projected/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-kube-api-access-5pbjv\") pod \"redhat-marketplace-gk89z\" (UID: \"47adddb7-6c5d-4b97-8502-2fe887d9b8dc\") " pod="openshift-marketplace/redhat-marketplace-gk89z" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.376680 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-catalog-content\") pod \"redhat-marketplace-gk89z\" (UID: \"47adddb7-6c5d-4b97-8502-2fe887d9b8dc\") " pod="openshift-marketplace/redhat-marketplace-gk89z" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.377188 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-utilities\") pod \"redhat-marketplace-gk89z\" (UID: \"47adddb7-6c5d-4b97-8502-2fe887d9b8dc\") " pod="openshift-marketplace/redhat-marketplace-gk89z" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.377244 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-catalog-content\") pod \"redhat-marketplace-gk89z\" (UID: \"47adddb7-6c5d-4b97-8502-2fe887d9b8dc\") " pod="openshift-marketplace/redhat-marketplace-gk89z" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.400826 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pbjv\" (UniqueName: \"kubernetes.io/projected/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-kube-api-access-5pbjv\") pod \"redhat-marketplace-gk89z\" (UID: \"47adddb7-6c5d-4b97-8502-2fe887d9b8dc\") " pod="openshift-marketplace/redhat-marketplace-gk89z" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.426420 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gk89z" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.701815 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7vp5p"] Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.703287 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7vp5p" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.705458 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.763486 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7vp5p"] Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.803098 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/836c7750-5680-4a56-8947-2df3b121bb3f-catalog-content\") pod \"redhat-operators-7vp5p\" (UID: \"836c7750-5680-4a56-8947-2df3b121bb3f\") " pod="openshift-marketplace/redhat-operators-7vp5p" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.803163 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2xnm\" (UniqueName: \"kubernetes.io/projected/836c7750-5680-4a56-8947-2df3b121bb3f-kube-api-access-s2xnm\") pod \"redhat-operators-7vp5p\" (UID: \"836c7750-5680-4a56-8947-2df3b121bb3f\") " pod="openshift-marketplace/redhat-operators-7vp5p" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.803223 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/836c7750-5680-4a56-8947-2df3b121bb3f-utilities\") pod \"redhat-operators-7vp5p\" (UID: \"836c7750-5680-4a56-8947-2df3b121bb3f\") " pod="openshift-marketplace/redhat-operators-7vp5p" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.871209 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gk89z"] Jan 05 13:53:26 crc kubenswrapper[4740]: W0105 13:53:26.877055 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47adddb7_6c5d_4b97_8502_2fe887d9b8dc.slice/crio-7460ffc59b19fee7fce89939349e2a99a4bd5ba746fc48836bb6fde0a1b07081 WatchSource:0}: Error finding container 7460ffc59b19fee7fce89939349e2a99a4bd5ba746fc48836bb6fde0a1b07081: Status 404 returned error can't find the container with id 7460ffc59b19fee7fce89939349e2a99a4bd5ba746fc48836bb6fde0a1b07081 Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.904176 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/836c7750-5680-4a56-8947-2df3b121bb3f-catalog-content\") pod \"redhat-operators-7vp5p\" (UID: \"836c7750-5680-4a56-8947-2df3b121bb3f\") " pod="openshift-marketplace/redhat-operators-7vp5p" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.904226 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2xnm\" (UniqueName: \"kubernetes.io/projected/836c7750-5680-4a56-8947-2df3b121bb3f-kube-api-access-s2xnm\") pod \"redhat-operators-7vp5p\" (UID: \"836c7750-5680-4a56-8947-2df3b121bb3f\") " pod="openshift-marketplace/redhat-operators-7vp5p" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.904252 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/836c7750-5680-4a56-8947-2df3b121bb3f-utilities\") pod \"redhat-operators-7vp5p\" (UID: \"836c7750-5680-4a56-8947-2df3b121bb3f\") " pod="openshift-marketplace/redhat-operators-7vp5p" Jan 
05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.904755 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/836c7750-5680-4a56-8947-2df3b121bb3f-utilities\") pod \"redhat-operators-7vp5p\" (UID: \"836c7750-5680-4a56-8947-2df3b121bb3f\") " pod="openshift-marketplace/redhat-operators-7vp5p" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.904954 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/836c7750-5680-4a56-8947-2df3b121bb3f-catalog-content\") pod \"redhat-operators-7vp5p\" (UID: \"836c7750-5680-4a56-8947-2df3b121bb3f\") " pod="openshift-marketplace/redhat-operators-7vp5p" Jan 05 13:53:26 crc kubenswrapper[4740]: I0105 13:53:26.928807 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2xnm\" (UniqueName: \"kubernetes.io/projected/836c7750-5680-4a56-8947-2df3b121bb3f-kube-api-access-s2xnm\") pod \"redhat-operators-7vp5p\" (UID: \"836c7750-5680-4a56-8947-2df3b121bb3f\") " pod="openshift-marketplace/redhat-operators-7vp5p" Jan 05 13:53:27 crc kubenswrapper[4740]: I0105 13:53:27.098879 4740 generic.go:334] "Generic (PLEG): container finished" podID="b023e432-3b4e-4161-bfcc-b5d8b601e9d5" containerID="c6ae1e75ece654c80057df4d0d6be852493b5aac17a36fee5c29112edd683c2b" exitCode=0 Jan 05 13:53:27 crc kubenswrapper[4740]: I0105 13:53:27.099240 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwcfs" event={"ID":"b023e432-3b4e-4161-bfcc-b5d8b601e9d5","Type":"ContainerDied","Data":"c6ae1e75ece654c80057df4d0d6be852493b5aac17a36fee5c29112edd683c2b"} Jan 05 13:53:27 crc kubenswrapper[4740]: I0105 13:53:27.100929 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gk89z" event={"ID":"47adddb7-6c5d-4b97-8502-2fe887d9b8dc","Type":"ContainerStarted","Data":"7460ffc59b19fee7fce89939349e2a99a4bd5ba746fc48836bb6fde0a1b07081"} Jan 05 13:53:27 crc kubenswrapper[4740]: I0105 13:53:27.651672 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7vp5p" Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.108457 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7vp5p"] Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.109266 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwcfs" event={"ID":"b023e432-3b4e-4161-bfcc-b5d8b601e9d5","Type":"ContainerStarted","Data":"49e3c5d55b8417967a5697fac2c60ab436f6f9300d15d5c27645bcad1820bbfb"} Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.111005 4740 generic.go:334] "Generic (PLEG): container finished" podID="4df26838-83be-4000-b37e-841a0457717b" containerID="2c259e52abdeaf435b0b89b681e1ca5eda9e6576a9a5f920b6e2c2864db37bb1" exitCode=0 Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.111111 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj7tn" event={"ID":"4df26838-83be-4000-b37e-841a0457717b","Type":"ContainerDied","Data":"2c259e52abdeaf435b0b89b681e1ca5eda9e6576a9a5f920b6e2c2864db37bb1"} Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.114871 4740 generic.go:334] "Generic (PLEG): container finished" podID="47adddb7-6c5d-4b97-8502-2fe887d9b8dc" containerID="699d26ba0613533b5165f4cc2e4a786d1182455eabc8a807b8c8de0e4a07a15a" exitCode=0 Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.115787 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gk89z" event={"ID":"47adddb7-6c5d-4b97-8502-2fe887d9b8dc","Type":"ContainerDied","Data":"699d26ba0613533b5165f4cc2e4a786d1182455eabc8a807b8c8de0e4a07a15a"} Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.129469 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dwcfs" podStartSLOduration=2.556874096 podStartE2EDuration="5.129442335s" podCreationTimestamp="2026-01-05 13:53:23 +0000 UTC" firstStartedPulling="2026-01-05 13:53:25.085178794 +0000 UTC m=+254.392087373" lastFinishedPulling="2026-01-05 13:53:27.657747033 +0000 UTC m=+256.964655612" observedRunningTime="2026-01-05 13:53:28.126838774 +0000 UTC m=+257.433747343" watchObservedRunningTime="2026-01-05 13:53:28.129442335 +0000 UTC m=+257.436350924" Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.499506 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lfp42"] Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.500693 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lfp42" Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.516584 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lfp42"] Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.644187 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879-utilities\") pod \"certified-operators-lfp42\" (UID: \"f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879\") " pod="openshift-marketplace/certified-operators-lfp42" Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.644313 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879-catalog-content\") pod \"certified-operators-lfp42\" (UID: \"f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879\") " pod="openshift-marketplace/certified-operators-lfp42" Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.644360 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t52df\" (UniqueName: \"kubernetes.io/projected/f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879-kube-api-access-t52df\") pod \"certified-operators-lfp42\" (UID: \"f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879\") " pod="openshift-marketplace/certified-operators-lfp42" Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.745305 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879-utilities\") pod \"certified-operators-lfp42\" (UID: \"f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879\") " pod="openshift-marketplace/certified-operators-lfp42" Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.745397 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879-catalog-content\") pod \"certified-operators-lfp42\" (UID: \"f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879\") " pod="openshift-marketplace/certified-operators-lfp42" Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.745432 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t52df\" (UniqueName: \"kubernetes.io/projected/f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879-kube-api-access-t52df\") pod \"certified-operators-lfp42\" (UID: \"f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879\") " pod="openshift-marketplace/certified-operators-lfp42" Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.745977 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879-catalog-content\") pod \"certified-operators-lfp42\" (UID: \"f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879\") " pod="openshift-marketplace/certified-operators-lfp42" Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.746624 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879-utilities\") pod \"certified-operators-lfp42\" (UID: \"f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879\") " pod="openshift-marketplace/certified-operators-lfp42" Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.765791 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-t52df\" (UniqueName: \"kubernetes.io/projected/f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879-kube-api-access-t52df\") pod \"certified-operators-lfp42\" (UID: \"f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879\") " pod="openshift-marketplace/certified-operators-lfp42" Jan 05 13:53:28 crc kubenswrapper[4740]: I0105 13:53:28.848906 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lfp42" Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.097916 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g9bwh"] Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.102609 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g9bwh" Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.116284 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g9bwh"] Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.129395 4740 generic.go:334] "Generic (PLEG): container finished" podID="836c7750-5680-4a56-8947-2df3b121bb3f" containerID="d30250159b725c750d2472fbed2badb5dc8541c988a64c89c0731c30bf431903" exitCode=0 Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.129494 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7vp5p" event={"ID":"836c7750-5680-4a56-8947-2df3b121bb3f","Type":"ContainerDied","Data":"d30250159b725c750d2472fbed2badb5dc8541c988a64c89c0731c30bf431903"} Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.129540 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7vp5p" event={"ID":"836c7750-5680-4a56-8947-2df3b121bb3f","Type":"ContainerStarted","Data":"ae5d10965a6572daf6c88719e807ee66da8eb056ca930502a5e01f050242e78a"} Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.137393 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj7tn" event={"ID":"4df26838-83be-4000-b37e-841a0457717b","Type":"ContainerStarted","Data":"2b357731851aa4a332c644ec27c87cbbe5df7f8fe6f6444933618c4875ac856f"} Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.145982 4740 generic.go:334] "Generic (PLEG): container finished" podID="47adddb7-6c5d-4b97-8502-2fe887d9b8dc" containerID="793cc50c26e1809c9e0365b472f18a62b283b925b321add808b54597a9a16bda" exitCode=0 Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.146822 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gk89z" event={"ID":"47adddb7-6c5d-4b97-8502-2fe887d9b8dc","Type":"ContainerDied","Data":"793cc50c26e1809c9e0365b472f18a62b283b925b321add808b54597a9a16bda"} Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.169796 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tj7tn" podStartSLOduration=2.75992769 podStartE2EDuration="5.169780298s" podCreationTimestamp="2026-01-05 13:53:24 +0000 UTC" firstStartedPulling="2026-01-05 13:53:26.093446065 +0000 UTC m=+255.400354644" lastFinishedPulling="2026-01-05 13:53:28.503298673 +0000 UTC m=+257.810207252" observedRunningTime="2026-01-05 13:53:29.169376557 +0000 UTC m=+258.476285136" watchObservedRunningTime="2026-01-05 13:53:29.169780298 +0000 UTC m=+258.476688877" Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.250729 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb39900-d5f8-4d29-b3ec-01a60b2e4378-utilities\") pod \"community-operators-g9bwh\" (UID: \"ceb39900-d5f8-4d29-b3ec-01a60b2e4378\") " pod="openshift-marketplace/community-operators-g9bwh" Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.250798 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb39900-d5f8-4d29-b3ec-01a60b2e4378-catalog-content\") pod \"community-operators-g9bwh\" (UID: \"ceb39900-d5f8-4d29-b3ec-01a60b2e4378\") " pod="openshift-marketplace/community-operators-g9bwh" Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.251172 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42hj9\" (UniqueName: \"kubernetes.io/projected/ceb39900-d5f8-4d29-b3ec-01a60b2e4378-kube-api-access-42hj9\") pod \"community-operators-g9bwh\" (UID: \"ceb39900-d5f8-4d29-b3ec-01a60b2e4378\") " pod="openshift-marketplace/community-operators-g9bwh" Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.323072 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lfp42"] Jan 05 13:53:29 crc kubenswrapper[4740]: W0105 13:53:29.331420 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6e2c2ba_abe4_4d56_92c8_4a3f4f5d8879.slice/crio-c1fb40a4a0368897cd9529c0889957d99f7c339227867b00307265fb43a551a1 WatchSource:0}: Error finding container c1fb40a4a0368897cd9529c0889957d99f7c339227867b00307265fb43a551a1: Status 404 returned error can't find the container with id c1fb40a4a0368897cd9529c0889957d99f7c339227867b00307265fb43a551a1 Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.353327 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42hj9\" (UniqueName: \"kubernetes.io/projected/ceb39900-d5f8-4d29-b3ec-01a60b2e4378-kube-api-access-42hj9\") pod \"community-operators-g9bwh\" (UID: \"ceb39900-d5f8-4d29-b3ec-01a60b2e4378\") " pod="openshift-marketplace/community-operators-g9bwh" Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.353496 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb39900-d5f8-4d29-b3ec-01a60b2e4378-utilities\") pod \"community-operators-g9bwh\" (UID: \"ceb39900-d5f8-4d29-b3ec-01a60b2e4378\") " pod="openshift-marketplace/community-operators-g9bwh" Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.353583 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb39900-d5f8-4d29-b3ec-01a60b2e4378-catalog-content\") pod \"community-operators-g9bwh\" (UID: \"ceb39900-d5f8-4d29-b3ec-01a60b2e4378\") " pod="openshift-marketplace/community-operators-g9bwh" Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.354054 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb39900-d5f8-4d29-b3ec-01a60b2e4378-utilities\") pod \"community-operators-g9bwh\" (UID: \"ceb39900-d5f8-4d29-b3ec-01a60b2e4378\") " pod="openshift-marketplace/community-operators-g9bwh" Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.354068 4740 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb39900-d5f8-4d29-b3ec-01a60b2e4378-catalog-content\") pod \"community-operators-g9bwh\" (UID: \"ceb39900-d5f8-4d29-b3ec-01a60b2e4378\") " pod="openshift-marketplace/community-operators-g9bwh" Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.378081 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42hj9\" (UniqueName: \"kubernetes.io/projected/ceb39900-d5f8-4d29-b3ec-01a60b2e4378-kube-api-access-42hj9\") pod \"community-operators-g9bwh\" (UID: \"ceb39900-d5f8-4d29-b3ec-01a60b2e4378\") " pod="openshift-marketplace/community-operators-g9bwh" Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.429301 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g9bwh" Jan 05 13:53:29 crc kubenswrapper[4740]: I0105 13:53:29.637622 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g9bwh"] Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.173364 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7vp5p" event={"ID":"836c7750-5680-4a56-8947-2df3b121bb3f","Type":"ContainerStarted","Data":"8efbd7c65ee3adf250b2c1c2c44f2e23f0d6dec238b5f0ce67bb24f842368513"} Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.175443 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gk89z" event={"ID":"47adddb7-6c5d-4b97-8502-2fe887d9b8dc","Type":"ContainerStarted","Data":"1d873e358bf056c3c71f696db16854df0fe6e58ac27921eefda8d2ff2d3c675e"} Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.177414 4740 generic.go:334] "Generic (PLEG): container finished" podID="ceb39900-d5f8-4d29-b3ec-01a60b2e4378" containerID="70fea575f89497c81409331c97f7e6fd92cd6e3603cbeae052c6aa4bfb593f2f" exitCode=0 Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.177657 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9bwh" event={"ID":"ceb39900-d5f8-4d29-b3ec-01a60b2e4378","Type":"ContainerDied","Data":"70fea575f89497c81409331c97f7e6fd92cd6e3603cbeae052c6aa4bfb593f2f"} Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.177685 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9bwh" event={"ID":"ceb39900-d5f8-4d29-b3ec-01a60b2e4378","Type":"ContainerStarted","Data":"7f1c1064a5dc4d51f30f309b98ef46007ba427121b12e5cbc22bf4e8ea704278"} Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.178920 4740 generic.go:334] "Generic (PLEG): container finished" podID="f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879" containerID="1d7954419db6e53e06174cbd706a01584b8698d047c9ce53ce889ec7ca2df0a7" exitCode=0 Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.179040 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfp42" event={"ID":"f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879","Type":"ContainerDied","Data":"1d7954419db6e53e06174cbd706a01584b8698d047c9ce53ce889ec7ca2df0a7"} Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.179078 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfp42" event={"ID":"f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879","Type":"ContainerStarted","Data":"c1fb40a4a0368897cd9529c0889957d99f7c339227867b00307265fb43a551a1"} Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.234236 4740 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gk89z" podStartSLOduration=2.584012681 podStartE2EDuration="4.234216358s" podCreationTimestamp="2026-01-05 13:53:26 +0000 UTC" firstStartedPulling="2026-01-05 13:53:28.116429091 +0000 UTC m=+257.423337660" lastFinishedPulling="2026-01-05 13:53:29.766632768 +0000 UTC m=+259.073541337" observedRunningTime="2026-01-05 13:53:30.210188903 +0000 UTC m=+259.517097492" watchObservedRunningTime="2026-01-05 13:53:30.234216358 +0000 UTC m=+259.541124947" Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.706014 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5zsln"] Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.707241 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5zsln" Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.718598 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5zsln"] Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.877921 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68cdb1a1-6b8f-410a-955a-aa1077491ae7-utilities\") pod \"redhat-marketplace-5zsln\" (UID: \"68cdb1a1-6b8f-410a-955a-aa1077491ae7\") " pod="openshift-marketplace/redhat-marketplace-5zsln" Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.877969 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68cdb1a1-6b8f-410a-955a-aa1077491ae7-catalog-content\") pod \"redhat-marketplace-5zsln\" (UID: \"68cdb1a1-6b8f-410a-955a-aa1077491ae7\") " pod="openshift-marketplace/redhat-marketplace-5zsln" Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.878002 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4vdh\" (UniqueName: \"kubernetes.io/projected/68cdb1a1-6b8f-410a-955a-aa1077491ae7-kube-api-access-f4vdh\") pod \"redhat-marketplace-5zsln\" (UID: \"68cdb1a1-6b8f-410a-955a-aa1077491ae7\") " pod="openshift-marketplace/redhat-marketplace-5zsln" Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.978742 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68cdb1a1-6b8f-410a-955a-aa1077491ae7-utilities\") pod \"redhat-marketplace-5zsln\" (UID: \"68cdb1a1-6b8f-410a-955a-aa1077491ae7\") " pod="openshift-marketplace/redhat-marketplace-5zsln" Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.978796 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68cdb1a1-6b8f-410a-955a-aa1077491ae7-catalog-content\") pod \"redhat-marketplace-5zsln\" (UID: \"68cdb1a1-6b8f-410a-955a-aa1077491ae7\") " pod="openshift-marketplace/redhat-marketplace-5zsln" Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.978856 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4vdh\" (UniqueName: \"kubernetes.io/projected/68cdb1a1-6b8f-410a-955a-aa1077491ae7-kube-api-access-f4vdh\") pod \"redhat-marketplace-5zsln\" (UID: \"68cdb1a1-6b8f-410a-955a-aa1077491ae7\") " pod="openshift-marketplace/redhat-marketplace-5zsln" Jan 05 13:53:30 crc 
kubenswrapper[4740]: I0105 13:53:30.979260 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68cdb1a1-6b8f-410a-955a-aa1077491ae7-utilities\") pod \"redhat-marketplace-5zsln\" (UID: \"68cdb1a1-6b8f-410a-955a-aa1077491ae7\") " pod="openshift-marketplace/redhat-marketplace-5zsln" Jan 05 13:53:30 crc kubenswrapper[4740]: I0105 13:53:30.979298 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68cdb1a1-6b8f-410a-955a-aa1077491ae7-catalog-content\") pod \"redhat-marketplace-5zsln\" (UID: \"68cdb1a1-6b8f-410a-955a-aa1077491ae7\") " pod="openshift-marketplace/redhat-marketplace-5zsln" Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.018430 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4vdh\" (UniqueName: \"kubernetes.io/projected/68cdb1a1-6b8f-410a-955a-aa1077491ae7-kube-api-access-f4vdh\") pod \"redhat-marketplace-5zsln\" (UID: \"68cdb1a1-6b8f-410a-955a-aa1077491ae7\") " pod="openshift-marketplace/redhat-marketplace-5zsln" Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.025036 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5zsln" Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.189538 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfp42" event={"ID":"f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879","Type":"ContainerStarted","Data":"0f6c040c8acad5268122fde9facff55923301e305bf5d00eb37a2c6210f585c7"} Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.191086 4740 generic.go:334] "Generic (PLEG): container finished" podID="836c7750-5680-4a56-8947-2df3b121bb3f" containerID="8efbd7c65ee3adf250b2c1c2c44f2e23f0d6dec238b5f0ce67bb24f842368513" exitCode=0 Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.191132 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7vp5p" event={"ID":"836c7750-5680-4a56-8947-2df3b121bb3f","Type":"ContainerDied","Data":"8efbd7c65ee3adf250b2c1c2c44f2e23f0d6dec238b5f0ce67bb24f842368513"} Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.210481 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9bwh" event={"ID":"ceb39900-d5f8-4d29-b3ec-01a60b2e4378","Type":"ContainerStarted","Data":"340bd579dad8f9b31db95cc277c9d735ee559cfca6f3769a50b33fa5dae873e6"} Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.290824 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5zsln"] Jan 05 13:53:31 crc kubenswrapper[4740]: W0105 13:53:31.305132 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68cdb1a1_6b8f_410a_955a_aa1077491ae7.slice/crio-fba7f07329158a6e25d08da21eb5fb92ac4cfe62005ff9deae98958c10726151 WatchSource:0}: Error finding container fba7f07329158a6e25d08da21eb5fb92ac4cfe62005ff9deae98958c10726151: Status 404 returned error can't find the container with id fba7f07329158a6e25d08da21eb5fb92ac4cfe62005ff9deae98958c10726151 Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.499821 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mkqxp"] Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.511738 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-mkqxp"] Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.511854 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mkqxp" Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.691862 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc07eab5-3d3e-4da1-aff1-dc180039a90a-catalog-content\") pod \"redhat-operators-mkqxp\" (UID: \"dc07eab5-3d3e-4da1-aff1-dc180039a90a\") " pod="openshift-marketplace/redhat-operators-mkqxp" Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.691920 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc07eab5-3d3e-4da1-aff1-dc180039a90a-utilities\") pod \"redhat-operators-mkqxp\" (UID: \"dc07eab5-3d3e-4da1-aff1-dc180039a90a\") " pod="openshift-marketplace/redhat-operators-mkqxp" Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.691990 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dhrg\" (UniqueName: \"kubernetes.io/projected/dc07eab5-3d3e-4da1-aff1-dc180039a90a-kube-api-access-9dhrg\") pod \"redhat-operators-mkqxp\" (UID: \"dc07eab5-3d3e-4da1-aff1-dc180039a90a\") " pod="openshift-marketplace/redhat-operators-mkqxp" Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.793318 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc07eab5-3d3e-4da1-aff1-dc180039a90a-catalog-content\") pod \"redhat-operators-mkqxp\" (UID: \"dc07eab5-3d3e-4da1-aff1-dc180039a90a\") " pod="openshift-marketplace/redhat-operators-mkqxp" Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.793387 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc07eab5-3d3e-4da1-aff1-dc180039a90a-utilities\") pod \"redhat-operators-mkqxp\" (UID: \"dc07eab5-3d3e-4da1-aff1-dc180039a90a\") " pod="openshift-marketplace/redhat-operators-mkqxp" Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.793433 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dhrg\" (UniqueName: \"kubernetes.io/projected/dc07eab5-3d3e-4da1-aff1-dc180039a90a-kube-api-access-9dhrg\") pod \"redhat-operators-mkqxp\" (UID: \"dc07eab5-3d3e-4da1-aff1-dc180039a90a\") " pod="openshift-marketplace/redhat-operators-mkqxp" Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.794483 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc07eab5-3d3e-4da1-aff1-dc180039a90a-catalog-content\") pod \"redhat-operators-mkqxp\" (UID: \"dc07eab5-3d3e-4da1-aff1-dc180039a90a\") " pod="openshift-marketplace/redhat-operators-mkqxp" Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.794655 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc07eab5-3d3e-4da1-aff1-dc180039a90a-utilities\") pod \"redhat-operators-mkqxp\" (UID: \"dc07eab5-3d3e-4da1-aff1-dc180039a90a\") " pod="openshift-marketplace/redhat-operators-mkqxp" Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.814418 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9dhrg\" (UniqueName: \"kubernetes.io/projected/dc07eab5-3d3e-4da1-aff1-dc180039a90a-kube-api-access-9dhrg\") pod \"redhat-operators-mkqxp\" (UID: \"dc07eab5-3d3e-4da1-aff1-dc180039a90a\") " pod="openshift-marketplace/redhat-operators-mkqxp" Jan 05 13:53:31 crc kubenswrapper[4740]: I0105 13:53:31.934433 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mkqxp" Jan 05 13:53:32 crc kubenswrapper[4740]: I0105 13:53:32.223999 4740 generic.go:334] "Generic (PLEG): container finished" podID="f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879" containerID="0f6c040c8acad5268122fde9facff55923301e305bf5d00eb37a2c6210f585c7" exitCode=0 Jan 05 13:53:32 crc kubenswrapper[4740]: I0105 13:53:32.224101 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfp42" event={"ID":"f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879","Type":"ContainerDied","Data":"0f6c040c8acad5268122fde9facff55923301e305bf5d00eb37a2c6210f585c7"} Jan 05 13:53:32 crc kubenswrapper[4740]: I0105 13:53:32.234294 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7vp5p" event={"ID":"836c7750-5680-4a56-8947-2df3b121bb3f","Type":"ContainerStarted","Data":"33606fc3f5649f1e278d9efda8d0da23cc1653a1b6bb0bce02b8f3f2a15d1199"} Jan 05 13:53:32 crc kubenswrapper[4740]: I0105 13:53:32.236616 4740 generic.go:334] "Generic (PLEG): container finished" podID="ceb39900-d5f8-4d29-b3ec-01a60b2e4378" containerID="340bd579dad8f9b31db95cc277c9d735ee559cfca6f3769a50b33fa5dae873e6" exitCode=0 Jan 05 13:53:32 crc kubenswrapper[4740]: I0105 13:53:32.236705 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9bwh" event={"ID":"ceb39900-d5f8-4d29-b3ec-01a60b2e4378","Type":"ContainerDied","Data":"340bd579dad8f9b31db95cc277c9d735ee559cfca6f3769a50b33fa5dae873e6"} Jan 05 13:53:32 crc kubenswrapper[4740]: I0105 13:53:32.239510 4740 generic.go:334] "Generic (PLEG): container finished" podID="68cdb1a1-6b8f-410a-955a-aa1077491ae7" containerID="55395d17ce98197832b4910c9c23d0542e0da7a26c41a292c36bc67120c3cb05" exitCode=0 Jan 05 13:53:32 crc kubenswrapper[4740]: I0105 13:53:32.239571 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zsln" event={"ID":"68cdb1a1-6b8f-410a-955a-aa1077491ae7","Type":"ContainerDied","Data":"55395d17ce98197832b4910c9c23d0542e0da7a26c41a292c36bc67120c3cb05"} Jan 05 13:53:32 crc kubenswrapper[4740]: I0105 13:53:32.239751 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zsln" event={"ID":"68cdb1a1-6b8f-410a-955a-aa1077491ae7","Type":"ContainerStarted","Data":"fba7f07329158a6e25d08da21eb5fb92ac4cfe62005ff9deae98958c10726151"} Jan 05 13:53:32 crc kubenswrapper[4740]: I0105 13:53:32.308010 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7vp5p" podStartSLOduration=3.8100883 podStartE2EDuration="6.307990886s" podCreationTimestamp="2026-01-05 13:53:26 +0000 UTC" firstStartedPulling="2026-01-05 13:53:29.130696584 +0000 UTC m=+258.437605153" lastFinishedPulling="2026-01-05 13:53:31.62859917 +0000 UTC m=+260.935507739" observedRunningTime="2026-01-05 13:53:32.30331826 +0000 UTC m=+261.610226859" watchObservedRunningTime="2026-01-05 13:53:32.307990886 +0000 UTC m=+261.614899465" Jan 05 13:53:32 crc kubenswrapper[4740]: I0105 13:53:32.336670 4740 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/redhat-operators-mkqxp"] Jan 05 13:53:32 crc kubenswrapper[4740]: W0105 13:53:32.356030 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc07eab5_3d3e_4da1_aff1_dc180039a90a.slice/crio-90d414076e9b403f1d8532209c1e5b70fdfb519f2083df2732297fedc20f8bae WatchSource:0}: Error finding container 90d414076e9b403f1d8532209c1e5b70fdfb519f2083df2732297fedc20f8bae: Status 404 returned error can't find the container with id 90d414076e9b403f1d8532209c1e5b70fdfb519f2083df2732297fedc20f8bae Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.255976 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9bwh" event={"ID":"ceb39900-d5f8-4d29-b3ec-01a60b2e4378","Type":"ContainerStarted","Data":"a53a80f4aad9c504e50b8d8df60aed5d7ed5f5c9eea0d85fa15f8df55e90d6d1"} Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.258757 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zsln" event={"ID":"68cdb1a1-6b8f-410a-955a-aa1077491ae7","Type":"ContainerStarted","Data":"52727c5539b679c68105c3c6f7c1710d018a82ee22fb3fcf24617e682c0e09ea"} Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.261595 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfp42" event={"ID":"f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879","Type":"ContainerStarted","Data":"6722cb4b55185ee91c1ee5c192b2dd06c0bd5b60d6dbcb6b135272fd8e2e8690"} Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.262867 4740 generic.go:334] "Generic (PLEG): container finished" podID="dc07eab5-3d3e-4da1-aff1-dc180039a90a" containerID="17ef6c673aaa2ba795125d84a19299936cf03a0d513ca1538bce094e5b90db66" exitCode=0 Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.263379 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkqxp" event={"ID":"dc07eab5-3d3e-4da1-aff1-dc180039a90a","Type":"ContainerDied","Data":"17ef6c673aaa2ba795125d84a19299936cf03a0d513ca1538bce094e5b90db66"} Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.263419 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkqxp" event={"ID":"dc07eab5-3d3e-4da1-aff1-dc180039a90a","Type":"ContainerStarted","Data":"90d414076e9b403f1d8532209c1e5b70fdfb519f2083df2732297fedc20f8bae"} Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.281626 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g9bwh" podStartSLOduration=1.622252142 podStartE2EDuration="4.281609452s" podCreationTimestamp="2026-01-05 13:53:29 +0000 UTC" firstStartedPulling="2026-01-05 13:53:30.17922355 +0000 UTC m=+259.486132129" lastFinishedPulling="2026-01-05 13:53:32.83858085 +0000 UTC m=+262.145489439" observedRunningTime="2026-01-05 13:53:33.279918534 +0000 UTC m=+262.586827113" watchObservedRunningTime="2026-01-05 13:53:33.281609452 +0000 UTC m=+262.588518031" Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.303747 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dgjkq"] Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.306058 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dgjkq" Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.319641 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dgjkq"] Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.326192 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lfp42" podStartSLOduration=2.78133424 podStartE2EDuration="5.326174492s" podCreationTimestamp="2026-01-05 13:53:28 +0000 UTC" firstStartedPulling="2026-01-05 13:53:30.181009179 +0000 UTC m=+259.487917758" lastFinishedPulling="2026-01-05 13:53:32.725849391 +0000 UTC m=+262.032758010" observedRunningTime="2026-01-05 13:53:33.323391565 +0000 UTC m=+262.630300134" watchObservedRunningTime="2026-01-05 13:53:33.326174492 +0000 UTC m=+262.633083071" Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.429769 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c01131-569c-43e4-b848-8d4b49a383d4-catalog-content\") pod \"certified-operators-dgjkq\" (UID: \"02c01131-569c-43e4-b848-8d4b49a383d4\") " pod="openshift-marketplace/certified-operators-dgjkq" Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.430086 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px6kh\" (UniqueName: \"kubernetes.io/projected/02c01131-569c-43e4-b848-8d4b49a383d4-kube-api-access-px6kh\") pod \"certified-operators-dgjkq\" (UID: \"02c01131-569c-43e4-b848-8d4b49a383d4\") " pod="openshift-marketplace/certified-operators-dgjkq" Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.430586 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02c01131-569c-43e4-b848-8d4b49a383d4-utilities\") pod \"certified-operators-dgjkq\" (UID: \"02c01131-569c-43e4-b848-8d4b49a383d4\") " pod="openshift-marketplace/certified-operators-dgjkq" Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.532633 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02c01131-569c-43e4-b848-8d4b49a383d4-utilities\") pod \"certified-operators-dgjkq\" (UID: \"02c01131-569c-43e4-b848-8d4b49a383d4\") " pod="openshift-marketplace/certified-operators-dgjkq" Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.533040 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c01131-569c-43e4-b848-8d4b49a383d4-catalog-content\") pod \"certified-operators-dgjkq\" (UID: \"02c01131-569c-43e4-b848-8d4b49a383d4\") " pod="openshift-marketplace/certified-operators-dgjkq" Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.533148 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px6kh\" (UniqueName: \"kubernetes.io/projected/02c01131-569c-43e4-b848-8d4b49a383d4-kube-api-access-px6kh\") pod \"certified-operators-dgjkq\" (UID: \"02c01131-569c-43e4-b848-8d4b49a383d4\") " pod="openshift-marketplace/certified-operators-dgjkq" Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.533289 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/02c01131-569c-43e4-b848-8d4b49a383d4-utilities\") pod \"certified-operators-dgjkq\" (UID: \"02c01131-569c-43e4-b848-8d4b49a383d4\") " pod="openshift-marketplace/certified-operators-dgjkq" Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.533394 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c01131-569c-43e4-b848-8d4b49a383d4-catalog-content\") pod \"certified-operators-dgjkq\" (UID: \"02c01131-569c-43e4-b848-8d4b49a383d4\") " pod="openshift-marketplace/certified-operators-dgjkq" Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.553025 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px6kh\" (UniqueName: \"kubernetes.io/projected/02c01131-569c-43e4-b848-8d4b49a383d4-kube-api-access-px6kh\") pod \"certified-operators-dgjkq\" (UID: \"02c01131-569c-43e4-b848-8d4b49a383d4\") " pod="openshift-marketplace/certified-operators-dgjkq" Jan 05 13:53:33 crc kubenswrapper[4740]: I0105 13:53:33.656304 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dgjkq" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.067072 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dgjkq"] Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.228334 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dwcfs" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.228696 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dwcfs" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.270424 4740 generic.go:334] "Generic (PLEG): container finished" podID="68cdb1a1-6b8f-410a-955a-aa1077491ae7" containerID="52727c5539b679c68105c3c6f7c1710d018a82ee22fb3fcf24617e682c0e09ea" exitCode=0 Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.270498 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zsln" event={"ID":"68cdb1a1-6b8f-410a-955a-aa1077491ae7","Type":"ContainerDied","Data":"52727c5539b679c68105c3c6f7c1710d018a82ee22fb3fcf24617e682c0e09ea"} Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.276519 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkqxp" event={"ID":"dc07eab5-3d3e-4da1-aff1-dc180039a90a","Type":"ContainerStarted","Data":"edb3a61fa891df698ccac2bce4787b0ab2049c81d948a4a243b5456a376d884c"} Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.279741 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgjkq" event={"ID":"02c01131-569c-43e4-b848-8d4b49a383d4","Type":"ContainerStarted","Data":"6083c07037334458adbeddf57da0ffab148cb7a2a69190aa0dd3fedf2fcb62a1"} Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.279776 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgjkq" event={"ID":"02c01131-569c-43e4-b848-8d4b49a383d4","Type":"ContainerStarted","Data":"f1654d0d98e32b24ba0b4b29e66bf89722459840c47c9ca20b913ad1fab34cb4"} Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.280941 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dwcfs" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.306295 4740 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bgrks"] Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.307846 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bgrks" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.315331 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bgrks"] Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.341035 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dwcfs" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.422879 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tj7tn" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.422935 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tj7tn" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.450765 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7-catalog-content\") pod \"community-operators-bgrks\" (UID: \"6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7\") " pod="openshift-marketplace/community-operators-bgrks" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.450830 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh7hl\" (UniqueName: \"kubernetes.io/projected/6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7-kube-api-access-qh7hl\") pod \"community-operators-bgrks\" (UID: \"6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7\") " pod="openshift-marketplace/community-operators-bgrks" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.451355 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7-utilities\") pod \"community-operators-bgrks\" (UID: \"6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7\") " pod="openshift-marketplace/community-operators-bgrks" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.458966 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tj7tn" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.504674 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58c887dd9d-wq6r9"] Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.504899 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" podUID="6f00b825-8d7b-406a-9a52-710436e38e99" containerName="controller-manager" containerID="cri-o://ca2a69e21f5045703893beece1eb892c63dab98834189d898ba79a0cc6735dfa" gracePeriod=30 Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.552890 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7-utilities\") pod \"community-operators-bgrks\" (UID: \"6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7\") " pod="openshift-marketplace/community-operators-bgrks" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.552960 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7-catalog-content\") pod \"community-operators-bgrks\" (UID: \"6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7\") " pod="openshift-marketplace/community-operators-bgrks" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.552984 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh7hl\" (UniqueName: \"kubernetes.io/projected/6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7-kube-api-access-qh7hl\") pod \"community-operators-bgrks\" (UID: \"6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7\") " pod="openshift-marketplace/community-operators-bgrks" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.553761 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7-utilities\") pod \"community-operators-bgrks\" (UID: \"6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7\") " pod="openshift-marketplace/community-operators-bgrks" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.554017 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7-catalog-content\") pod \"community-operators-bgrks\" (UID: \"6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7\") " pod="openshift-marketplace/community-operators-bgrks" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.583983 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh7hl\" (UniqueName: \"kubernetes.io/projected/6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7-kube-api-access-qh7hl\") pod \"community-operators-bgrks\" (UID: \"6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7\") " pod="openshift-marketplace/community-operators-bgrks" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.644186 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bgrks" Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.928798 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bgrks"] Jan 05 13:53:34 crc kubenswrapper[4740]: W0105 13:53:34.941003 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bf07ffe_6600_4ad4_b94c_3b7ac4a613a7.slice/crio-74f9182e55452bb2893de3d487a435c8c6a53f1e62c55e9a913ced695df33b21 WatchSource:0}: Error finding container 74f9182e55452bb2893de3d487a435c8c6a53f1e62c55e9a913ced695df33b21: Status 404 returned error can't find the container with id 74f9182e55452bb2893de3d487a435c8c6a53f1e62c55e9a913ced695df33b21 Jan 05 13:53:34 crc kubenswrapper[4740]: I0105 13:53:34.993125 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.160306 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-client-ca\") pod \"6f00b825-8d7b-406a-9a52-710436e38e99\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.160967 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-client-ca" (OuterVolumeSpecName: "client-ca") pod "6f00b825-8d7b-406a-9a52-710436e38e99" (UID: "6f00b825-8d7b-406a-9a52-710436e38e99"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.161030 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfsgr\" (UniqueName: \"kubernetes.io/projected/6f00b825-8d7b-406a-9a52-710436e38e99-kube-api-access-nfsgr\") pod \"6f00b825-8d7b-406a-9a52-710436e38e99\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.161820 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-config" (OuterVolumeSpecName: "config") pod "6f00b825-8d7b-406a-9a52-710436e38e99" (UID: "6f00b825-8d7b-406a-9a52-710436e38e99"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.161837 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-config\") pod \"6f00b825-8d7b-406a-9a52-710436e38e99\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.161948 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-proxy-ca-bundles\") pod \"6f00b825-8d7b-406a-9a52-710436e38e99\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.162027 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f00b825-8d7b-406a-9a52-710436e38e99-serving-cert\") pod \"6f00b825-8d7b-406a-9a52-710436e38e99\" (UID: \"6f00b825-8d7b-406a-9a52-710436e38e99\") " Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.162350 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6f00b825-8d7b-406a-9a52-710436e38e99" (UID: "6f00b825-8d7b-406a-9a52-710436e38e99"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.162642 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.162666 4740 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.162680 4740 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f00b825-8d7b-406a-9a52-710436e38e99-client-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.166912 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f00b825-8d7b-406a-9a52-710436e38e99-kube-api-access-nfsgr" (OuterVolumeSpecName: "kube-api-access-nfsgr") pod "6f00b825-8d7b-406a-9a52-710436e38e99" (UID: "6f00b825-8d7b-406a-9a52-710436e38e99"). InnerVolumeSpecName "kube-api-access-nfsgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.169996 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f00b825-8d7b-406a-9a52-710436e38e99-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6f00b825-8d7b-406a-9a52-710436e38e99" (UID: "6f00b825-8d7b-406a-9a52-710436e38e99"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.263507 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfsgr\" (UniqueName: \"kubernetes.io/projected/6f00b825-8d7b-406a-9a52-710436e38e99-kube-api-access-nfsgr\") on node \"crc\" DevicePath \"\"" Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.263542 4740 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f00b825-8d7b-406a-9a52-710436e38e99-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.284031 4740 generic.go:334] "Generic (PLEG): container finished" podID="6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7" containerID="3ed1664ac71e72e064a31970fe7e58df2319f61ab8e48ba992cab3f6a2cde588" exitCode=0 Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.284128 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgrks" event={"ID":"6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7","Type":"ContainerDied","Data":"3ed1664ac71e72e064a31970fe7e58df2319f61ab8e48ba992cab3f6a2cde588"} Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.284156 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgrks" event={"ID":"6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7","Type":"ContainerStarted","Data":"74f9182e55452bb2893de3d487a435c8c6a53f1e62c55e9a913ced695df33b21"} Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.286409 4740 generic.go:334] "Generic (PLEG): container finished" podID="6f00b825-8d7b-406a-9a52-710436e38e99" containerID="ca2a69e21f5045703893beece1eb892c63dab98834189d898ba79a0cc6735dfa" exitCode=0 Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.286450 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" event={"ID":"6f00b825-8d7b-406a-9a52-710436e38e99","Type":"ContainerDied","Data":"ca2a69e21f5045703893beece1eb892c63dab98834189d898ba79a0cc6735dfa"} Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.286466 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" event={"ID":"6f00b825-8d7b-406a-9a52-710436e38e99","Type":"ContainerDied","Data":"f72fb2b46167338ccbf08499748a0e6a49174e057ea28a50d931ab101467812e"} Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.286481 4740 scope.go:117] "RemoveContainer" containerID="ca2a69e21f5045703893beece1eb892c63dab98834189d898ba79a0cc6735dfa" Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.286551 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58c887dd9d-wq6r9" Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.294801 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zsln" event={"ID":"68cdb1a1-6b8f-410a-955a-aa1077491ae7","Type":"ContainerStarted","Data":"778005c870171e4861881ee77cb9c8a6c66ecae130a74a8c31989f8ea4aa0261"} Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.297495 4740 generic.go:334] "Generic (PLEG): container finished" podID="dc07eab5-3d3e-4da1-aff1-dc180039a90a" containerID="edb3a61fa891df698ccac2bce4787b0ab2049c81d948a4a243b5456a376d884c" exitCode=0 Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.297536 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkqxp" event={"ID":"dc07eab5-3d3e-4da1-aff1-dc180039a90a","Type":"ContainerDied","Data":"edb3a61fa891df698ccac2bce4787b0ab2049c81d948a4a243b5456a376d884c"} Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.299319 4740 generic.go:334] "Generic (PLEG): container finished" podID="02c01131-569c-43e4-b848-8d4b49a383d4" containerID="6083c07037334458adbeddf57da0ffab148cb7a2a69190aa0dd3fedf2fcb62a1" exitCode=0 Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.300059 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgjkq" event={"ID":"02c01131-569c-43e4-b848-8d4b49a383d4","Type":"ContainerDied","Data":"6083c07037334458adbeddf57da0ffab148cb7a2a69190aa0dd3fedf2fcb62a1"} Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.309142 4740 scope.go:117] "RemoveContainer" containerID="ca2a69e21f5045703893beece1eb892c63dab98834189d898ba79a0cc6735dfa" Jan 05 13:53:35 crc kubenswrapper[4740]: E0105 13:53:35.309712 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca2a69e21f5045703893beece1eb892c63dab98834189d898ba79a0cc6735dfa\": container with ID starting with ca2a69e21f5045703893beece1eb892c63dab98834189d898ba79a0cc6735dfa not found: ID does not exist" containerID="ca2a69e21f5045703893beece1eb892c63dab98834189d898ba79a0cc6735dfa" Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.309767 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca2a69e21f5045703893beece1eb892c63dab98834189d898ba79a0cc6735dfa"} err="failed to get container status \"ca2a69e21f5045703893beece1eb892c63dab98834189d898ba79a0cc6735dfa\": rpc error: code = NotFound desc = could not find container \"ca2a69e21f5045703893beece1eb892c63dab98834189d898ba79a0cc6735dfa\": container with ID 
starting with ca2a69e21f5045703893beece1eb892c63dab98834189d898ba79a0cc6735dfa not found: ID does not exist" Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.336697 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58c887dd9d-wq6r9"] Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.339673 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-58c887dd9d-wq6r9"] Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.347241 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tj7tn" Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.364430 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-n6tdc\" (UID: \"1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 13:53:35 crc kubenswrapper[4740]: E0105 13:53:35.364577 4740 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Jan 05 13:53:35 crc kubenswrapper[4740]: E0105 13:53:35.364646 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates podName:1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8 nodeName:}" failed. No retries permitted until 2026-01-05 13:54:39.364628082 +0000 UTC m=+328.671536661 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-n6tdc" (UID: "1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8") : secret "prometheus-operator-admission-webhook-tls" not found Jan 05 13:53:35 crc kubenswrapper[4740]: I0105 13:53:35.379702 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5zsln" podStartSLOduration=2.80641702 podStartE2EDuration="5.379688042s" podCreationTimestamp="2026-01-05 13:53:30 +0000 UTC" firstStartedPulling="2026-01-05 13:53:32.243438319 +0000 UTC m=+261.550346898" lastFinishedPulling="2026-01-05 13:53:34.816709341 +0000 UTC m=+264.123617920" observedRunningTime="2026-01-05 13:53:35.376332989 +0000 UTC m=+264.683241568" watchObservedRunningTime="2026-01-05 13:53:35.379688042 +0000 UTC m=+264.686596621" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.426676 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gk89z" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.427973 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gk89z" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.467660 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gk89z" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.502503 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47"] Jan 05 13:53:36 crc kubenswrapper[4740]: E0105 13:53:36.502734 4740 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6f00b825-8d7b-406a-9a52-710436e38e99" containerName="controller-manager" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.502746 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f00b825-8d7b-406a-9a52-710436e38e99" containerName="controller-manager" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.502836 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f00b825-8d7b-406a-9a52-710436e38e99" containerName="controller-manager" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.503209 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.506505 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.507242 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.507373 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.507482 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.507585 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.507912 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.510704 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47"] Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.512874 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.690128 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f061b27b-1e1c-42c2-9230-8f82deda6325-client-ca\") pod \"controller-manager-6cdbbbb5bc-rbd47\" (UID: \"f061b27b-1e1c-42c2-9230-8f82deda6325\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.690520 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f061b27b-1e1c-42c2-9230-8f82deda6325-config\") pod \"controller-manager-6cdbbbb5bc-rbd47\" (UID: \"f061b27b-1e1c-42c2-9230-8f82deda6325\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.690547 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f061b27b-1e1c-42c2-9230-8f82deda6325-serving-cert\") pod \"controller-manager-6cdbbbb5bc-rbd47\" (UID: \"f061b27b-1e1c-42c2-9230-8f82deda6325\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.690612 
4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f061b27b-1e1c-42c2-9230-8f82deda6325-proxy-ca-bundles\") pod \"controller-manager-6cdbbbb5bc-rbd47\" (UID: \"f061b27b-1e1c-42c2-9230-8f82deda6325\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.690633 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc2p4\" (UniqueName: \"kubernetes.io/projected/f061b27b-1e1c-42c2-9230-8f82deda6325-kube-api-access-qc2p4\") pod \"controller-manager-6cdbbbb5bc-rbd47\" (UID: \"f061b27b-1e1c-42c2-9230-8f82deda6325\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.791879 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f061b27b-1e1c-42c2-9230-8f82deda6325-proxy-ca-bundles\") pod \"controller-manager-6cdbbbb5bc-rbd47\" (UID: \"f061b27b-1e1c-42c2-9230-8f82deda6325\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.791938 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc2p4\" (UniqueName: \"kubernetes.io/projected/f061b27b-1e1c-42c2-9230-8f82deda6325-kube-api-access-qc2p4\") pod \"controller-manager-6cdbbbb5bc-rbd47\" (UID: \"f061b27b-1e1c-42c2-9230-8f82deda6325\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.791980 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f061b27b-1e1c-42c2-9230-8f82deda6325-client-ca\") pod \"controller-manager-6cdbbbb5bc-rbd47\" (UID: \"f061b27b-1e1c-42c2-9230-8f82deda6325\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.792050 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f061b27b-1e1c-42c2-9230-8f82deda6325-config\") pod \"controller-manager-6cdbbbb5bc-rbd47\" (UID: \"f061b27b-1e1c-42c2-9230-8f82deda6325\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.792094 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f061b27b-1e1c-42c2-9230-8f82deda6325-serving-cert\") pod \"controller-manager-6cdbbbb5bc-rbd47\" (UID: \"f061b27b-1e1c-42c2-9230-8f82deda6325\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.793513 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f061b27b-1e1c-42c2-9230-8f82deda6325-proxy-ca-bundles\") pod \"controller-manager-6cdbbbb5bc-rbd47\" (UID: \"f061b27b-1e1c-42c2-9230-8f82deda6325\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.793596 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f061b27b-1e1c-42c2-9230-8f82deda6325-config\") pod \"controller-manager-6cdbbbb5bc-rbd47\" (UID: \"f061b27b-1e1c-42c2-9230-8f82deda6325\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.793713 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f061b27b-1e1c-42c2-9230-8f82deda6325-client-ca\") pod \"controller-manager-6cdbbbb5bc-rbd47\" (UID: \"f061b27b-1e1c-42c2-9230-8f82deda6325\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.801504 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f061b27b-1e1c-42c2-9230-8f82deda6325-serving-cert\") pod \"controller-manager-6cdbbbb5bc-rbd47\" (UID: \"f061b27b-1e1c-42c2-9230-8f82deda6325\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.809725 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc2p4\" (UniqueName: \"kubernetes.io/projected/f061b27b-1e1c-42c2-9230-8f82deda6325-kube-api-access-qc2p4\") pod \"controller-manager-6cdbbbb5bc-rbd47\" (UID: \"f061b27b-1e1c-42c2-9230-8f82deda6325\") " pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.820587 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:36 crc kubenswrapper[4740]: I0105 13:53:36.984456 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f00b825-8d7b-406a-9a52-710436e38e99" path="/var/lib/kubelet/pods/6f00b825-8d7b-406a-9a52-710436e38e99/volumes" Jan 05 13:53:37 crc kubenswrapper[4740]: I0105 13:53:37.271227 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47"] Jan 05 13:53:37 crc kubenswrapper[4740]: I0105 13:53:37.320860 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" event={"ID":"f061b27b-1e1c-42c2-9230-8f82deda6325","Type":"ContainerStarted","Data":"79c2077aeba2be7da836838f9b257570a9a66f57193cf09cad47e3221d96177d"} Jan 05 13:53:37 crc kubenswrapper[4740]: I0105 13:53:37.323277 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkqxp" event={"ID":"dc07eab5-3d3e-4da1-aff1-dc180039a90a","Type":"ContainerStarted","Data":"545d9da97cbe36b86a2c5db5019c7013002ef846ea91a312dd9d11711528aa59"} Jan 05 13:53:37 crc kubenswrapper[4740]: I0105 13:53:37.327993 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgrks" event={"ID":"6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7","Type":"ContainerStarted","Data":"1d6253244f7b4bb3ad853bbd12950d61a6545b051fefc5b2194a2d3c34f4d19d"} Jan 05 13:53:37 crc kubenswrapper[4740]: I0105 13:53:37.330516 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgjkq" event={"ID":"02c01131-569c-43e4-b848-8d4b49a383d4","Type":"ContainerStarted","Data":"25734480a090c9d5e562a65c1fba1354bb1d9d889bb13ef8cb2eb469842b3b38"} Jan 05 13:53:37 crc kubenswrapper[4740]: I0105 13:53:37.349631 4740 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-operators-mkqxp" podStartSLOduration=3.147200685 podStartE2EDuration="6.349614685s" podCreationTimestamp="2026-01-05 13:53:31 +0000 UTC" firstStartedPulling="2026-01-05 13:53:33.264172766 +0000 UTC m=+262.571081355" lastFinishedPulling="2026-01-05 13:53:36.466586776 +0000 UTC m=+265.773495355" observedRunningTime="2026-01-05 13:53:37.348255627 +0000 UTC m=+266.655164226" watchObservedRunningTime="2026-01-05 13:53:37.349614685 +0000 UTC m=+266.656523264" Jan 05 13:53:37 crc kubenswrapper[4740]: I0105 13:53:37.394599 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gk89z" Jan 05 13:53:37 crc kubenswrapper[4740]: I0105 13:53:37.653508 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7vp5p" Jan 05 13:53:37 crc kubenswrapper[4740]: I0105 13:53:37.653581 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7vp5p" Jan 05 13:53:38 crc kubenswrapper[4740]: I0105 13:53:38.337518 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" event={"ID":"f061b27b-1e1c-42c2-9230-8f82deda6325","Type":"ContainerStarted","Data":"642b6cdffc94c05cafdd62614d483b3daf241708b521d3a8f422c714c83ea1c7"} Jan 05 13:53:38 crc kubenswrapper[4740]: I0105 13:53:38.360531 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" podStartSLOduration=4.360508604 podStartE2EDuration="4.360508604s" podCreationTimestamp="2026-01-05 13:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:53:38.352055468 +0000 UTC m=+267.658964047" watchObservedRunningTime="2026-01-05 13:53:38.360508604 +0000 UTC m=+267.667417193" Jan 05 13:53:38 crc kubenswrapper[4740]: I0105 13:53:38.699473 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7vp5p" podUID="836c7750-5680-4a56-8947-2df3b121bb3f" containerName="registry-server" probeResult="failure" output=< Jan 05 13:53:38 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 13:53:38 crc kubenswrapper[4740]: > Jan 05 13:53:38 crc kubenswrapper[4740]: I0105 13:53:38.849145 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lfp42" Jan 05 13:53:38 crc kubenswrapper[4740]: I0105 13:53:38.849499 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lfp42" Jan 05 13:53:38 crc kubenswrapper[4740]: I0105 13:53:38.888058 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lfp42" Jan 05 13:53:39 crc kubenswrapper[4740]: I0105 13:53:39.344115 4740 generic.go:334] "Generic (PLEG): container finished" podID="02c01131-569c-43e4-b848-8d4b49a383d4" containerID="25734480a090c9d5e562a65c1fba1354bb1d9d889bb13ef8cb2eb469842b3b38" exitCode=0 Jan 05 13:53:39 crc kubenswrapper[4740]: I0105 13:53:39.344197 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgjkq" 
event={"ID":"02c01131-569c-43e4-b848-8d4b49a383d4","Type":"ContainerDied","Data":"25734480a090c9d5e562a65c1fba1354bb1d9d889bb13ef8cb2eb469842b3b38"} Jan 05 13:53:39 crc kubenswrapper[4740]: I0105 13:53:39.347968 4740 generic.go:334] "Generic (PLEG): container finished" podID="6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7" containerID="1d6253244f7b4bb3ad853bbd12950d61a6545b051fefc5b2194a2d3c34f4d19d" exitCode=0 Jan 05 13:53:39 crc kubenswrapper[4740]: I0105 13:53:39.348245 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgrks" event={"ID":"6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7","Type":"ContainerDied","Data":"1d6253244f7b4bb3ad853bbd12950d61a6545b051fefc5b2194a2d3c34f4d19d"} Jan 05 13:53:39 crc kubenswrapper[4740]: I0105 13:53:39.349601 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:39 crc kubenswrapper[4740]: I0105 13:53:39.361952 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 13:53:39 crc kubenswrapper[4740]: I0105 13:53:39.414907 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lfp42" Jan 05 13:53:39 crc kubenswrapper[4740]: I0105 13:53:39.429619 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g9bwh" Jan 05 13:53:39 crc kubenswrapper[4740]: I0105 13:53:39.429907 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g9bwh" Jan 05 13:53:39 crc kubenswrapper[4740]: I0105 13:53:39.480384 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g9bwh" Jan 05 13:53:40 crc kubenswrapper[4740]: I0105 13:53:40.403716 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g9bwh" Jan 05 13:53:41 crc kubenswrapper[4740]: I0105 13:53:41.025844 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5zsln" Jan 05 13:53:41 crc kubenswrapper[4740]: I0105 13:53:41.025942 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5zsln" Jan 05 13:53:41 crc kubenswrapper[4740]: I0105 13:53:41.093152 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5zsln" Jan 05 13:53:41 crc kubenswrapper[4740]: I0105 13:53:41.407905 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5zsln" Jan 05 13:53:41 crc kubenswrapper[4740]: I0105 13:53:41.935531 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mkqxp" Jan 05 13:53:41 crc kubenswrapper[4740]: I0105 13:53:41.935600 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mkqxp" Jan 05 13:53:42 crc kubenswrapper[4740]: I0105 13:53:42.365672 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgjkq" event={"ID":"02c01131-569c-43e4-b848-8d4b49a383d4","Type":"ContainerStarted","Data":"1f926909f130f55c7e98cf194727868746ce0aea04d9d5c667e66af4ee1532db"} Jan 05 
13:53:42 crc kubenswrapper[4740]: I0105 13:53:42.367855 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgrks" event={"ID":"6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7","Type":"ContainerStarted","Data":"bc47231d1aa1c8ef9abb60a95ef8088c1872021e0c1b8e03570c6dab3b119651"} Jan 05 13:53:42 crc kubenswrapper[4740]: I0105 13:53:42.385187 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dgjkq" podStartSLOduration=3.223904271 podStartE2EDuration="9.385159371s" podCreationTimestamp="2026-01-05 13:53:33 +0000 UTC" firstStartedPulling="2026-01-05 13:53:35.300675543 +0000 UTC m=+264.607584122" lastFinishedPulling="2026-01-05 13:53:41.461930643 +0000 UTC m=+270.768839222" observedRunningTime="2026-01-05 13:53:42.382894108 +0000 UTC m=+271.689802697" watchObservedRunningTime="2026-01-05 13:53:42.385159371 +0000 UTC m=+271.692067950" Jan 05 13:53:42 crc kubenswrapper[4740]: I0105 13:53:42.401927 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bgrks" podStartSLOduration=2.432482206 podStartE2EDuration="8.401897306s" podCreationTimestamp="2026-01-05 13:53:34 +0000 UTC" firstStartedPulling="2026-01-05 13:53:35.285210152 +0000 UTC m=+264.592118731" lastFinishedPulling="2026-01-05 13:53:41.254625252 +0000 UTC m=+270.561533831" observedRunningTime="2026-01-05 13:53:42.399717776 +0000 UTC m=+271.706626355" watchObservedRunningTime="2026-01-05 13:53:42.401897306 +0000 UTC m=+271.708805885" Jan 05 13:53:42 crc kubenswrapper[4740]: I0105 13:53:42.980459 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mkqxp" podUID="dc07eab5-3d3e-4da1-aff1-dc180039a90a" containerName="registry-server" probeResult="failure" output=< Jan 05 13:53:42 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 13:53:42 crc kubenswrapper[4740]: > Jan 05 13:53:43 crc kubenswrapper[4740]: I0105 13:53:43.656872 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dgjkq" Jan 05 13:53:43 crc kubenswrapper[4740]: I0105 13:53:43.657673 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dgjkq" Jan 05 13:53:43 crc kubenswrapper[4740]: I0105 13:53:43.716253 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dgjkq" Jan 05 13:53:44 crc kubenswrapper[4740]: I0105 13:53:44.644357 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bgrks" Jan 05 13:53:44 crc kubenswrapper[4740]: I0105 13:53:44.644725 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bgrks" Jan 05 13:53:44 crc kubenswrapper[4740]: I0105 13:53:44.677868 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bgrks" Jan 05 13:53:44 crc kubenswrapper[4740]: I0105 13:53:44.956024 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-gflkl" Jan 05 13:53:45 crc kubenswrapper[4740]: I0105 13:53:45.007716 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dqws6"] Jan 05 13:53:47 crc 
kubenswrapper[4740]: I0105 13:53:47.738583 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7vp5p" Jan 05 13:53:47 crc kubenswrapper[4740]: I0105 13:53:47.815910 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7vp5p" Jan 05 13:53:51 crc kubenswrapper[4740]: I0105 13:53:51.979560 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mkqxp" Jan 05 13:53:52 crc kubenswrapper[4740]: I0105 13:53:52.030170 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mkqxp" Jan 05 13:53:53 crc kubenswrapper[4740]: I0105 13:53:53.730949 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dgjkq" Jan 05 13:53:54 crc kubenswrapper[4740]: I0105 13:53:54.686339 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bgrks" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.044991 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" podUID="f488db4f-55ab-4654-b225-38742d36877c" containerName="registry" containerID="cri-o://4ff63df9dc8111a76a79899c7d8931edc8bc2574a9da763bdad4b3b0691ea8d1" gracePeriod=30 Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.568362 4740 generic.go:334] "Generic (PLEG): container finished" podID="f488db4f-55ab-4654-b225-38742d36877c" containerID="4ff63df9dc8111a76a79899c7d8931edc8bc2574a9da763bdad4b3b0691ea8d1" exitCode=0 Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.568409 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" event={"ID":"f488db4f-55ab-4654-b225-38742d36877c","Type":"ContainerDied","Data":"4ff63df9dc8111a76a79899c7d8931edc8bc2574a9da763bdad4b3b0691ea8d1"} Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.568434 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" event={"ID":"f488db4f-55ab-4654-b225-38742d36877c","Type":"ContainerDied","Data":"42104e33f0a574f91773b7df63da715a1bc79ab1e6c279c64adbcc175fbd83ca"} Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.568444 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42104e33f0a574f91773b7df63da715a1bc79ab1e6c279c64adbcc175fbd83ca" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.590436 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.701816 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f488db4f-55ab-4654-b225-38742d36877c\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.701947 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f488db4f-55ab-4654-b225-38742d36877c-ca-trust-extracted\") pod \"f488db4f-55ab-4654-b225-38742d36877c\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.702005 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-registry-tls\") pod \"f488db4f-55ab-4654-b225-38742d36877c\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.702138 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f488db4f-55ab-4654-b225-38742d36877c-trusted-ca\") pod \"f488db4f-55ab-4654-b225-38742d36877c\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.702204 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt2vz\" (UniqueName: \"kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-kube-api-access-xt2vz\") pod \"f488db4f-55ab-4654-b225-38742d36877c\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.702265 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f488db4f-55ab-4654-b225-38742d36877c-registry-certificates\") pod \"f488db4f-55ab-4654-b225-38742d36877c\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.702304 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f488db4f-55ab-4654-b225-38742d36877c-installation-pull-secrets\") pod \"f488db4f-55ab-4654-b225-38742d36877c\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.702335 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-bound-sa-token\") pod \"f488db4f-55ab-4654-b225-38742d36877c\" (UID: \"f488db4f-55ab-4654-b225-38742d36877c\") " Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.703472 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f488db4f-55ab-4654-b225-38742d36877c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f488db4f-55ab-4654-b225-38742d36877c" (UID: "f488db4f-55ab-4654-b225-38742d36877c"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.708101 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f488db4f-55ab-4654-b225-38742d36877c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f488db4f-55ab-4654-b225-38742d36877c" (UID: "f488db4f-55ab-4654-b225-38742d36877c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.708162 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-kube-api-access-xt2vz" (OuterVolumeSpecName: "kube-api-access-xt2vz") pod "f488db4f-55ab-4654-b225-38742d36877c" (UID: "f488db4f-55ab-4654-b225-38742d36877c"). InnerVolumeSpecName "kube-api-access-xt2vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.708718 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f488db4f-55ab-4654-b225-38742d36877c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f488db4f-55ab-4654-b225-38742d36877c" (UID: "f488db4f-55ab-4654-b225-38742d36877c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.709352 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f488db4f-55ab-4654-b225-38742d36877c" (UID: "f488db4f-55ab-4654-b225-38742d36877c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.712656 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f488db4f-55ab-4654-b225-38742d36877c" (UID: "f488db4f-55ab-4654-b225-38742d36877c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.717550 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f488db4f-55ab-4654-b225-38742d36877c" (UID: "f488db4f-55ab-4654-b225-38742d36877c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.723913 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f488db4f-55ab-4654-b225-38742d36877c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f488db4f-55ab-4654-b225-38742d36877c" (UID: "f488db4f-55ab-4654-b225-38742d36877c"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.804059 4740 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f488db4f-55ab-4654-b225-38742d36877c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.804137 4740 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.804159 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f488db4f-55ab-4654-b225-38742d36877c-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.804178 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt2vz\" (UniqueName: \"kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-kube-api-access-xt2vz\") on node \"crc\" DevicePath \"\"" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.804200 4740 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f488db4f-55ab-4654-b225-38742d36877c-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.804222 4740 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f488db4f-55ab-4654-b225-38742d36877c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.804242 4740 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f488db4f-55ab-4654-b225-38742d36877c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 05 13:54:10 crc kubenswrapper[4740]: I0105 13:54:10.834603 4740 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 05 13:54:11 crc kubenswrapper[4740]: I0105 13:54:11.579981 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dqws6" Jan 05 13:54:11 crc kubenswrapper[4740]: I0105 13:54:11.602999 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dqws6"] Jan 05 13:54:11 crc kubenswrapper[4740]: I0105 13:54:11.609730 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dqws6"] Jan 05 13:54:12 crc kubenswrapper[4740]: I0105 13:54:12.979700 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f488db4f-55ab-4654-b225-38742d36877c" path="/var/lib/kubelet/pods/f488db4f-55ab-4654-b225-38742d36877c/volumes" Jan 05 13:54:34 crc kubenswrapper[4740]: E0105 13:54:34.487612 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[tls-certificates], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" podUID="1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8" Jan 05 13:54:34 crc kubenswrapper[4740]: I0105 13:54:34.739255 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 13:54:39 crc kubenswrapper[4740]: I0105 13:54:39.445438 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-n6tdc\" (UID: \"1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 13:54:39 crc kubenswrapper[4740]: I0105 13:54:39.454090 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-n6tdc\" (UID: \"1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 13:54:39 crc kubenswrapper[4740]: I0105 13:54:39.543507 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-kshq2" Jan 05 13:54:39 crc kubenswrapper[4740]: I0105 13:54:39.551615 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 13:54:40 crc kubenswrapper[4740]: I0105 13:54:40.014133 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc"] Jan 05 13:54:40 crc kubenswrapper[4740]: I0105 13:54:40.780697 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" event={"ID":"1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8","Type":"ContainerStarted","Data":"836375ebd681103f6d4f9c0db500a62ab5cc08e383e2bc57a93880e7d3ec19f6"} Jan 05 13:54:41 crc kubenswrapper[4740]: I0105 13:54:41.805014 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" event={"ID":"1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8","Type":"ContainerStarted","Data":"803251059ac78863fda7261eae0f16876d61fed9009de0e860465cb2a83226e6"} Jan 05 13:54:41 crc kubenswrapper[4740]: I0105 13:54:41.805801 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 13:54:41 crc kubenswrapper[4740]: I0105 13:54:41.817015 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 13:54:41 crc kubenswrapper[4740]: I0105 13:54:41.836812 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" podStartSLOduration=129.566901175 podStartE2EDuration="2m10.836785639s" podCreationTimestamp="2026-01-05 13:52:31 +0000 UTC" firstStartedPulling="2026-01-05 13:54:40.023535332 +0000 UTC m=+329.330443921" lastFinishedPulling="2026-01-05 13:54:41.293419816 +0000 UTC m=+330.600328385" observedRunningTime="2026-01-05 13:54:41.830648079 +0000 UTC m=+331.137556688" watchObservedRunningTime="2026-01-05 13:54:41.836785639 +0000 UTC m=+331.143694258" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.592762 4740 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/prometheus-operator-db54df47d-4ms78"] Jan 05 13:54:42 crc kubenswrapper[4740]: E0105 13:54:42.593351 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f488db4f-55ab-4654-b225-38742d36877c" containerName="registry" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.593367 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f488db4f-55ab-4654-b225-38742d36877c" containerName="registry" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.593517 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f488db4f-55ab-4654-b225-38742d36877c" containerName="registry" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.594222 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.607181 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.607428 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.607635 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.609128 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-zchv7" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.618517 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-4ms78"] Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.790265 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg242\" (UniqueName: \"kubernetes.io/projected/f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a-kube-api-access-vg242\") pod \"prometheus-operator-db54df47d-4ms78\" (UID: \"f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.790605 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-4ms78\" (UID: \"f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.790860 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-4ms78\" (UID: \"f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.791023 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a-metrics-client-ca\") pod \"prometheus-operator-db54df47d-4ms78\" (UID: \"f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a\") " 
pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.892147 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-4ms78\" (UID: \"f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.892251 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a-metrics-client-ca\") pod \"prometheus-operator-db54df47d-4ms78\" (UID: \"f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.892357 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg242\" (UniqueName: \"kubernetes.io/projected/f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a-kube-api-access-vg242\") pod \"prometheus-operator-db54df47d-4ms78\" (UID: \"f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.892400 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-4ms78\" (UID: \"f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.893728 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a-metrics-client-ca\") pod \"prometheus-operator-db54df47d-4ms78\" (UID: \"f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" Jan 05 13:54:42 crc kubenswrapper[4740]: E0105 13:54:42.893769 4740 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Jan 05 13:54:42 crc kubenswrapper[4740]: E0105 13:54:42.893842 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a-prometheus-operator-tls podName:f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a nodeName:}" failed. No retries permitted until 2026-01-05 13:54:43.393825651 +0000 UTC m=+332.700734230 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a-prometheus-operator-tls") pod "prometheus-operator-db54df47d-4ms78" (UID: "f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a") : secret "prometheus-operator-tls" not found Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.898488 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-4ms78\" (UID: \"f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" Jan 05 13:54:42 crc kubenswrapper[4740]: I0105 13:54:42.924364 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg242\" (UniqueName: \"kubernetes.io/projected/f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a-kube-api-access-vg242\") pod \"prometheus-operator-db54df47d-4ms78\" (UID: \"f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" Jan 05 13:54:43 crc kubenswrapper[4740]: I0105 13:54:43.399014 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-4ms78\" (UID: \"f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" Jan 05 13:54:43 crc kubenswrapper[4740]: I0105 13:54:43.402944 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-4ms78\" (UID: \"f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" Jan 05 13:54:43 crc kubenswrapper[4740]: I0105 13:54:43.508891 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" Jan 05 13:54:43 crc kubenswrapper[4740]: I0105 13:54:43.966997 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-4ms78"] Jan 05 13:54:44 crc kubenswrapper[4740]: I0105 13:54:44.823946 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" event={"ID":"f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a","Type":"ContainerStarted","Data":"2d528037549fb05f74dd44ad6983956d0d2b034b01e1b3c66eff5e1c79955ba7"} Jan 05 13:54:46 crc kubenswrapper[4740]: I0105 13:54:46.835320 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" event={"ID":"f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a","Type":"ContainerStarted","Data":"32a5f9866e78478d2d79f02bb3331a7024be950785eb5840cebd35f5d87e0fb9"} Jan 05 13:54:46 crc kubenswrapper[4740]: I0105 13:54:46.835714 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" event={"ID":"f83ea5ed-3f8c-453c-9f10-9b0954d1fe7a","Type":"ContainerStarted","Data":"1acbb7c866cadd1dd6786380194047283293f6c7a6be2e43bad236af7949e349"} Jan 05 13:54:46 crc kubenswrapper[4740]: I0105 13:54:46.853435 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-4ms78" podStartSLOduration=3.010851046 podStartE2EDuration="4.853420186s" podCreationTimestamp="2026-01-05 13:54:42 +0000 UTC" firstStartedPulling="2026-01-05 13:54:43.97852441 +0000 UTC m=+333.285433019" lastFinishedPulling="2026-01-05 13:54:45.82109358 +0000 UTC m=+335.128002159" observedRunningTime="2026-01-05 13:54:46.851158365 +0000 UTC m=+336.158066964" watchObservedRunningTime="2026-01-05 13:54:46.853420186 +0000 UTC m=+336.160328755" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.004621 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9"] Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.006248 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.008236 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-jk8vn" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.008748 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.009709 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.025607 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9"] Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.034173 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-74p8k"] Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.035259 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.036955 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.036965 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-ncqcr" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.037180 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.105545 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g"] Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.111975 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.117764 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.117991 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-sq2r8" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.118744 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.119043 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.131221 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g"] Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.180119 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg54b\" (UniqueName: \"kubernetes.io/projected/86365d76-4bbe-4bc0-8d7d-f70921eb51db-kube-api-access-lg54b\") pod \"openshift-state-metrics-566fddb674-v8fb9\" (UID: \"86365d76-4bbe-4bc0-8d7d-f70921eb51db\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.180223 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/821ac131-9526-4465-b09b-836d409a07e5-sys\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.180281 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/821ac131-9526-4465-b09b-836d409a07e5-root\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.180571 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/821ac131-9526-4465-b09b-836d409a07e5-node-exporter-tls\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " 
pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.180620 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/86365d76-4bbe-4bc0-8d7d-f70921eb51db-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-v8fb9\" (UID: \"86365d76-4bbe-4bc0-8d7d-f70921eb51db\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.180681 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/821ac131-9526-4465-b09b-836d409a07e5-node-exporter-textfile\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.180898 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/86365d76-4bbe-4bc0-8d7d-f70921eb51db-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-v8fb9\" (UID: \"86365d76-4bbe-4bc0-8d7d-f70921eb51db\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.180948 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/821ac131-9526-4465-b09b-836d409a07e5-node-exporter-wtmp\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.180969 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86365d76-4bbe-4bc0-8d7d-f70921eb51db-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-v8fb9\" (UID: \"86365d76-4bbe-4bc0-8d7d-f70921eb51db\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.181040 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cszpr\" (UniqueName: \"kubernetes.io/projected/821ac131-9526-4465-b09b-836d409a07e5-kube-api-access-cszpr\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.181121 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/821ac131-9526-4465-b09b-836d409a07e5-metrics-client-ca\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.181141 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/821ac131-9526-4465-b09b-836d409a07e5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " 
pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.281901 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8faf659e-ecc1-4bbb-aa76-c781874ccb72-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: \"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.282014 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8faf659e-ecc1-4bbb-aa76-c781874ccb72-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: \"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.282049 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/821ac131-9526-4465-b09b-836d409a07e5-node-exporter-wtmp\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.282094 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86365d76-4bbe-4bc0-8d7d-f70921eb51db-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-v8fb9\" (UID: \"86365d76-4bbe-4bc0-8d7d-f70921eb51db\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.282215 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cszpr\" (UniqueName: \"kubernetes.io/projected/821ac131-9526-4465-b09b-836d409a07e5-kube-api-access-cszpr\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.282271 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/821ac131-9526-4465-b09b-836d409a07e5-metrics-client-ca\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.282427 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/821ac131-9526-4465-b09b-836d409a07e5-node-exporter-wtmp\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.282646 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/821ac131-9526-4465-b09b-836d409a07e5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.282706 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8faf659e-ecc1-4bbb-aa76-c781874ccb72-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: \"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.282750 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8faf659e-ecc1-4bbb-aa76-c781874ccb72-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: \"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.282777 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg54b\" (UniqueName: \"kubernetes.io/projected/86365d76-4bbe-4bc0-8d7d-f70921eb51db-kube-api-access-lg54b\") pod \"openshift-state-metrics-566fddb674-v8fb9\" (UID: \"86365d76-4bbe-4bc0-8d7d-f70921eb51db\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.282810 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8faf659e-ecc1-4bbb-aa76-c781874ccb72-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: \"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.282840 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/821ac131-9526-4465-b09b-836d409a07e5-sys\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.282884 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/821ac131-9526-4465-b09b-836d409a07e5-root\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.282914 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/821ac131-9526-4465-b09b-836d409a07e5-node-exporter-tls\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.282949 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/86365d76-4bbe-4bc0-8d7d-f70921eb51db-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-v8fb9\" (UID: \"86365d76-4bbe-4bc0-8d7d-f70921eb51db\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.282972 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/821ac131-9526-4465-b09b-836d409a07e5-node-exporter-textfile\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.283010 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-257xr\" (UniqueName: \"kubernetes.io/projected/8faf659e-ecc1-4bbb-aa76-c781874ccb72-kube-api-access-257xr\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: \"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.283036 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/86365d76-4bbe-4bc0-8d7d-f70921eb51db-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-v8fb9\" (UID: \"86365d76-4bbe-4bc0-8d7d-f70921eb51db\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.283211 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/821ac131-9526-4465-b09b-836d409a07e5-sys\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.283325 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86365d76-4bbe-4bc0-8d7d-f70921eb51db-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-v8fb9\" (UID: \"86365d76-4bbe-4bc0-8d7d-f70921eb51db\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.283412 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/821ac131-9526-4465-b09b-836d409a07e5-root\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.283654 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/821ac131-9526-4465-b09b-836d409a07e5-node-exporter-textfile\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.283809 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/821ac131-9526-4465-b09b-836d409a07e5-metrics-client-ca\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.288811 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/86365d76-4bbe-4bc0-8d7d-f70921eb51db-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-v8fb9\" (UID: \"86365d76-4bbe-4bc0-8d7d-f70921eb51db\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.288986 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/86365d76-4bbe-4bc0-8d7d-f70921eb51db-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-v8fb9\" (UID: \"86365d76-4bbe-4bc0-8d7d-f70921eb51db\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.289342 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/821ac131-9526-4465-b09b-836d409a07e5-node-exporter-tls\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.290645 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/821ac131-9526-4465-b09b-836d409a07e5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.302603 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cszpr\" (UniqueName: \"kubernetes.io/projected/821ac131-9526-4465-b09b-836d409a07e5-kube-api-access-cszpr\") pod \"node-exporter-74p8k\" (UID: \"821ac131-9526-4465-b09b-836d409a07e5\") " pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.303893 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg54b\" (UniqueName: \"kubernetes.io/projected/86365d76-4bbe-4bc0-8d7d-f70921eb51db-kube-api-access-lg54b\") pod \"openshift-state-metrics-566fddb674-v8fb9\" (UID: \"86365d76-4bbe-4bc0-8d7d-f70921eb51db\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.321711 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.384056 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-257xr\" (UniqueName: \"kubernetes.io/projected/8faf659e-ecc1-4bbb-aa76-c781874ccb72-kube-api-access-257xr\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: \"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.384156 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8faf659e-ecc1-4bbb-aa76-c781874ccb72-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: \"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.384186 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8faf659e-ecc1-4bbb-aa76-c781874ccb72-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: \"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.384240 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8faf659e-ecc1-4bbb-aa76-c781874ccb72-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: \"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.384264 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8faf659e-ecc1-4bbb-aa76-c781874ccb72-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: \"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.384294 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8faf659e-ecc1-4bbb-aa76-c781874ccb72-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: \"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.385339 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8faf659e-ecc1-4bbb-aa76-c781874ccb72-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: \"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.385388 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8faf659e-ecc1-4bbb-aa76-c781874ccb72-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: 
\"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.388309 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8faf659e-ecc1-4bbb-aa76-c781874ccb72-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: \"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.389781 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8faf659e-ecc1-4bbb-aa76-c781874ccb72-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: \"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.397425 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8faf659e-ecc1-4bbb-aa76-c781874ccb72-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: \"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.400944 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-257xr\" (UniqueName: \"kubernetes.io/projected/8faf659e-ecc1-4bbb-aa76-c781874ccb72-kube-api-access-257xr\") pod \"kube-state-metrics-777cb5bd5d-bxk6g\" (UID: \"8faf659e-ecc1-4bbb-aa76-c781874ccb72\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.408596 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-74p8k" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.427876 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.748667 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9"] Jan 05 13:54:49 crc kubenswrapper[4740]: W0105 13:54:49.751798 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86365d76_4bbe_4bc0_8d7d_f70921eb51db.slice/crio-895ad04b9c1d441c42904a04d85694c1b63b161dacaec70e474ed4a29f3ea3fc WatchSource:0}: Error finding container 895ad04b9c1d441c42904a04d85694c1b63b161dacaec70e474ed4a29f3ea3fc: Status 404 returned error can't find the container with id 895ad04b9c1d441c42904a04d85694c1b63b161dacaec70e474ed4a29f3ea3fc Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.825432 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g"] Jan 05 13:54:49 crc kubenswrapper[4740]: W0105 13:54:49.833409 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8faf659e_ecc1_4bbb_aa76_c781874ccb72.slice/crio-3140b3ec947a496cb7006e47e5ba8a1d845c6e7c0d0e21c674fc111fa0b4f580 WatchSource:0}: Error finding container 3140b3ec947a496cb7006e47e5ba8a1d845c6e7c0d0e21c674fc111fa0b4f580: Status 404 returned error can't find the container with id 3140b3ec947a496cb7006e47e5ba8a1d845c6e7c0d0e21c674fc111fa0b4f580 Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.854998 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" event={"ID":"86365d76-4bbe-4bc0-8d7d-f70921eb51db","Type":"ContainerStarted","Data":"895ad04b9c1d441c42904a04d85694c1b63b161dacaec70e474ed4a29f3ea3fc"} Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.856221 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-74p8k" event={"ID":"821ac131-9526-4465-b09b-836d409a07e5","Type":"ContainerStarted","Data":"e7307cde50e0c15594967894bafd93b17f534d84c7c2d4b0f03761fc0c582222"} Jan 05 13:54:49 crc kubenswrapper[4740]: I0105 13:54:49.857281 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" event={"ID":"8faf659e-ecc1-4bbb-aa76-c781874ccb72","Type":"ContainerStarted","Data":"3140b3ec947a496cb7006e47e5ba8a1d845c6e7c0d0e21c674fc111fa0b4f580"} Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.057128 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.059273 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.061168 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.061940 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.062106 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.062279 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-gcqtv" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.071727 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.071784 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.071906 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.071987 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.076810 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.088669 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.095501 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqwjb\" (UniqueName: \"kubernetes.io/projected/88defe7b-d680-4acf-880d-ac342e7f0a73-kube-api-access-qqwjb\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.095554 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/88defe7b-d680-4acf-880d-ac342e7f0a73-config-out\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.095573 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-config-volume\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.095596 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " 
pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.095622 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/88defe7b-d680-4acf-880d-ac342e7f0a73-tls-assets\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.095651 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.095668 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/88defe7b-d680-4acf-880d-ac342e7f0a73-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.095685 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.095724 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.095740 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88defe7b-d680-4acf-880d-ac342e7f0a73-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.095757 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88defe7b-d680-4acf-880d-ac342e7f0a73-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.095780 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-web-config\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.197173 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qqwjb\" (UniqueName: \"kubernetes.io/projected/88defe7b-d680-4acf-880d-ac342e7f0a73-kube-api-access-qqwjb\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.197224 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/88defe7b-d680-4acf-880d-ac342e7f0a73-config-out\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.197242 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-config-volume\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.197258 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.197278 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/88defe7b-d680-4acf-880d-ac342e7f0a73-tls-assets\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.197303 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.197318 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/88defe7b-d680-4acf-880d-ac342e7f0a73-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.197344 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.197384 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.197400 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88defe7b-d680-4acf-880d-ac342e7f0a73-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.197416 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88defe7b-d680-4acf-880d-ac342e7f0a73-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.197438 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-web-config\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.197835 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/88defe7b-d680-4acf-880d-ac342e7f0a73-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.199595 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88defe7b-d680-4acf-880d-ac342e7f0a73-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.199654 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88defe7b-d680-4acf-880d-ac342e7f0a73-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.202181 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-web-config\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.202185 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.202214 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.202716 
4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.202921 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/88defe7b-d680-4acf-880d-ac342e7f0a73-config-out\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.205534 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-config-volume\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.206717 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/88defe7b-d680-4acf-880d-ac342e7f0a73-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.210843 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/88defe7b-d680-4acf-880d-ac342e7f0a73-tls-assets\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.215380 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqwjb\" (UniqueName: \"kubernetes.io/projected/88defe7b-d680-4acf-880d-ac342e7f0a73-kube-api-access-qqwjb\") pod \"alertmanager-main-0\" (UID: \"88defe7b-d680-4acf-880d-ac342e7f0a73\") " pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.372130 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.801972 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Jan 05 13:54:50 crc kubenswrapper[4740]: W0105 13:54:50.810383 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88defe7b_d680_4acf_880d_ac342e7f0a73.slice/crio-a84e586fe439c0b3acc1517dea12121f71e8c9384df53cb7f3a3d79ee6538b14 WatchSource:0}: Error finding container a84e586fe439c0b3acc1517dea12121f71e8c9384df53cb7f3a3d79ee6538b14: Status 404 returned error can't find the container with id a84e586fe439c0b3acc1517dea12121f71e8c9384df53cb7f3a3d79ee6538b14 Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.868201 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" event={"ID":"86365d76-4bbe-4bc0-8d7d-f70921eb51db","Type":"ContainerStarted","Data":"53d1fcb9acfbf74a394ba0043357685df3216664fff1c61c0c9683e49ae8a6af"} Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.868246 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" event={"ID":"86365d76-4bbe-4bc0-8d7d-f70921eb51db","Type":"ContainerStarted","Data":"424d12e88f727e6f9d3fe72bdeae7bab2e97040f0d35e2422a169d26c938e810"} Jan 05 13:54:50 crc kubenswrapper[4740]: I0105 13:54:50.869539 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88defe7b-d680-4acf-880d-ac342e7f0a73","Type":"ContainerStarted","Data":"a84e586fe439c0b3acc1517dea12121f71e8c9384df53cb7f3a3d79ee6538b14"} Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.052612 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-78985bc954-b6gsd"] Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.054601 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.056148 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.056243 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.056920 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.057027 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-3j7obm6dhctqa" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.057196 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.057344 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-rgpxm" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.057411 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.080859 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-78985bc954-b6gsd"] Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.107412 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.107458 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.107495 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-thanos-querier-tls\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.107511 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd7sz\" (UniqueName: \"kubernetes.io/projected/3c267f0a-b0bb-43fe-9a21-92472096a632-kube-api-access-hd7sz\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.107537 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-grpc-tls\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.107556 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3c267f0a-b0bb-43fe-9a21-92472096a632-metrics-client-ca\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.107573 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.107602 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.208109 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.208162 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-thanos-querier-tls\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.208182 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd7sz\" (UniqueName: \"kubernetes.io/projected/3c267f0a-b0bb-43fe-9a21-92472096a632-kube-api-access-hd7sz\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.208208 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-grpc-tls\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.208230 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/3c267f0a-b0bb-43fe-9a21-92472096a632-metrics-client-ca\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.208250 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.208279 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.208311 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.210662 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3c267f0a-b0bb-43fe-9a21-92472096a632-metrics-client-ca\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.214171 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.214715 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-grpc-tls\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.214852 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.215003 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.215132 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-thanos-querier-tls\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.216470 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3c267f0a-b0bb-43fe-9a21-92472096a632-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.224318 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd7sz\" (UniqueName: \"kubernetes.io/projected/3c267f0a-b0bb-43fe-9a21-92472096a632-kube-api-access-hd7sz\") pod \"thanos-querier-78985bc954-b6gsd\" (UID: \"3c267f0a-b0bb-43fe-9a21-92472096a632\") " pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.382504 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.879330 4740 generic.go:334] "Generic (PLEG): container finished" podID="821ac131-9526-4465-b09b-836d409a07e5" containerID="751fd55fc8b4552a39f1be59d4af15940f38e7861c64a2f1573533de44feec72" exitCode=0 Jan 05 13:54:51 crc kubenswrapper[4740]: I0105 13:54:51.879390 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-74p8k" event={"ID":"821ac131-9526-4465-b09b-836d409a07e5","Type":"ContainerDied","Data":"751fd55fc8b4552a39f1be59d4af15940f38e7861c64a2f1573533de44feec72"} Jan 05 13:54:52 crc kubenswrapper[4740]: I0105 13:54:52.487199 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-78985bc954-b6gsd"] Jan 05 13:54:52 crc kubenswrapper[4740]: W0105 13:54:52.742746 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c267f0a_b0bb_43fe_9a21_92472096a632.slice/crio-6815385d6366df97ecfed33c7ffef08b2efd9c0b60fbed7a58b63063d1d50be8 WatchSource:0}: Error finding container 6815385d6366df97ecfed33c7ffef08b2efd9c0b60fbed7a58b63063d1d50be8: Status 404 returned error can't find the container with id 6815385d6366df97ecfed33c7ffef08b2efd9c0b60fbed7a58b63063d1d50be8 Jan 05 13:54:52 crc kubenswrapper[4740]: I0105 13:54:52.889358 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" event={"ID":"8faf659e-ecc1-4bbb-aa76-c781874ccb72","Type":"ContainerStarted","Data":"398db1537122ab8f1c6918c6cd23258eb5d5c0ba006f1f866b20a511b79e7cd4"} Jan 05 13:54:52 crc kubenswrapper[4740]: I0105 13:54:52.889805 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" 
event={"ID":"8faf659e-ecc1-4bbb-aa76-c781874ccb72","Type":"ContainerStarted","Data":"d3efe3f01349c1d0c783b9d661d470233c9ed6c35a9cbfe4e18f05b354428f87"} Jan 05 13:54:52 crc kubenswrapper[4740]: I0105 13:54:52.893733 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" event={"ID":"86365d76-4bbe-4bc0-8d7d-f70921eb51db","Type":"ContainerStarted","Data":"a52ce2cf4e0c0865e5eff72510e2d20eb9fad043b198a015499ccdb973d36592"} Jan 05 13:54:52 crc kubenswrapper[4740]: I0105 13:54:52.899579 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-74p8k" event={"ID":"821ac131-9526-4465-b09b-836d409a07e5","Type":"ContainerStarted","Data":"9b7cca35dfd630a85affbbe0e942141bced4d9272e5f327f7badbfb905b69558"} Jan 05 13:54:52 crc kubenswrapper[4740]: I0105 13:54:52.899751 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-74p8k" event={"ID":"821ac131-9526-4465-b09b-836d409a07e5","Type":"ContainerStarted","Data":"35296cc19b9263032fa21b3008237113ffa2b7d2a2076bebd0d89f3077b2ebb8"} Jan 05 13:54:52 crc kubenswrapper[4740]: I0105 13:54:52.901294 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" event={"ID":"3c267f0a-b0bb-43fe-9a21-92472096a632","Type":"ContainerStarted","Data":"6815385d6366df97ecfed33c7ffef08b2efd9c0b60fbed7a58b63063d1d50be8"} Jan 05 13:54:52 crc kubenswrapper[4740]: I0105 13:54:52.915968 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-v8fb9" podStartSLOduration=2.863368078 podStartE2EDuration="4.91595163s" podCreationTimestamp="2026-01-05 13:54:48 +0000 UTC" firstStartedPulling="2026-01-05 13:54:50.068888504 +0000 UTC m=+339.375797073" lastFinishedPulling="2026-01-05 13:54:52.121472046 +0000 UTC m=+341.428380625" observedRunningTime="2026-01-05 13:54:52.912560825 +0000 UTC m=+342.219469394" watchObservedRunningTime="2026-01-05 13:54:52.91595163 +0000 UTC m=+342.222860209" Jan 05 13:54:52 crc kubenswrapper[4740]: I0105 13:54:52.929611 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-74p8k" podStartSLOduration=2.592097811 podStartE2EDuration="3.929595368s" podCreationTimestamp="2026-01-05 13:54:49 +0000 UTC" firstStartedPulling="2026-01-05 13:54:49.444881627 +0000 UTC m=+338.751790206" lastFinishedPulling="2026-01-05 13:54:50.782379184 +0000 UTC m=+340.089287763" observedRunningTime="2026-01-05 13:54:52.926372188 +0000 UTC m=+342.233280797" watchObservedRunningTime="2026-01-05 13:54:52.929595368 +0000 UTC m=+342.236503947" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.754420 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6bd5cd66dc-t9skx"] Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.755235 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.781543 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bd5cd66dc-t9skx"] Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.852011 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-oauth-serving-cert\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.852099 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-trusted-ca-bundle\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.852127 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-serving-cert\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.852157 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-service-ca\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.852177 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srhrq\" (UniqueName: \"kubernetes.io/projected/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-kube-api-access-srhrq\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.852199 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-oauth-config\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.852215 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-config\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.909446 4740 generic.go:334] "Generic (PLEG): container finished" podID="88defe7b-d680-4acf-880d-ac342e7f0a73" containerID="de7eda38dd0db34ae50f27c1fcc1b733db2e4e88988351ea9c90164b444531d3" exitCode=0 Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.909513 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88defe7b-d680-4acf-880d-ac342e7f0a73","Type":"ContainerDied","Data":"de7eda38dd0db34ae50f27c1fcc1b733db2e4e88988351ea9c90164b444531d3"} Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.911870 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" event={"ID":"8faf659e-ecc1-4bbb-aa76-c781874ccb72","Type":"ContainerStarted","Data":"48920b4c8a8b14fd7f1c7c82d837f0c5ac05123851b6699fe93c41808313b272"} Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.953110 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-trusted-ca-bundle\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.953168 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-serving-cert\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.953230 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-service-ca\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.953266 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srhrq\" (UniqueName: \"kubernetes.io/projected/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-kube-api-access-srhrq\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.953306 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-oauth-config\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.953332 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-config\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.953373 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-oauth-serving-cert\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.957829 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-oauth-serving-cert\") pod 
\"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.958030 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-config\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.962497 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-service-ca\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.964032 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-trusted-ca-bundle\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.966164 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-bxk6g" podStartSLOduration=2.6791214930000002 podStartE2EDuration="4.966147043s" podCreationTimestamp="2026-01-05 13:54:49 +0000 UTC" firstStartedPulling="2026-01-05 13:54:49.836083551 +0000 UTC m=+339.142992130" lastFinishedPulling="2026-01-05 13:54:52.123109101 +0000 UTC m=+341.430017680" observedRunningTime="2026-01-05 13:54:53.964627631 +0000 UTC m=+343.271536250" watchObservedRunningTime="2026-01-05 13:54:53.966147043 +0000 UTC m=+343.273055632" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.968770 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-serving-cert\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.978184 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srhrq\" (UniqueName: \"kubernetes.io/projected/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-kube-api-access-srhrq\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:53 crc kubenswrapper[4740]: I0105 13:54:53.979664 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-oauth-config\") pod \"console-6bd5cd66dc-t9skx\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.075118 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.340956 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-85559f775f-nmxz8"] Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.342262 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.345823 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.345886 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-f98gi9ph1jdkn" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.345955 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.346124 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.346298 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.348378 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-6qkcd" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.359033 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.359157 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-secret-metrics-server-tls\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.359196 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-audit-log\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.359214 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-metrics-server-audit-profiles\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.359238 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j79wq\" (UniqueName: 
\"kubernetes.io/projected/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-kube-api-access-j79wq\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.359266 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-client-ca-bundle\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.359294 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-secret-metrics-client-certs\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.360656 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-85559f775f-nmxz8"] Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.460200 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-secret-metrics-server-tls\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.460245 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-audit-log\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.460266 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-metrics-server-audit-profiles\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.460282 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j79wq\" (UniqueName: \"kubernetes.io/projected/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-kube-api-access-j79wq\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.460309 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-client-ca-bundle\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.460331 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" 
(UniqueName: \"kubernetes.io/secret/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-secret-metrics-client-certs\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.460366 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.461144 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.461702 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-metrics-server-audit-profiles\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.463088 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-audit-log\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.467003 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-client-ca-bundle\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.467049 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-secret-metrics-client-certs\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.469828 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-secret-metrics-server-tls\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.486433 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j79wq\" (UniqueName: \"kubernetes.io/projected/8e7d78e7-6855-46c1-a51c-7f4127c80b7d-kube-api-access-j79wq\") pod \"metrics-server-85559f775f-nmxz8\" (UID: \"8e7d78e7-6855-46c1-a51c-7f4127c80b7d\") " 
pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.599236 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bd5cd66dc-t9skx"] Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.672096 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.743269 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4"] Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.744599 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.747340 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.747483 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.765806 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c85eb61e-dde6-42e7-b3d9-82837d0104d7-monitoring-plugin-cert\") pod \"monitoring-plugin-6fcb8d88f7-lcqs4\" (UID: \"c85eb61e-dde6-42e7-b3d9-82837d0104d7\") " pod="openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.765995 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4"] Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.866995 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c85eb61e-dde6-42e7-b3d9-82837d0104d7-monitoring-plugin-cert\") pod \"monitoring-plugin-6fcb8d88f7-lcqs4\" (UID: \"c85eb61e-dde6-42e7-b3d9-82837d0104d7\") " pod="openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4" Jan 05 13:54:54 crc kubenswrapper[4740]: I0105 13:54:54.890753 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c85eb61e-dde6-42e7-b3d9-82837d0104d7-monitoring-plugin-cert\") pod \"monitoring-plugin-6fcb8d88f7-lcqs4\" (UID: \"c85eb61e-dde6-42e7-b3d9-82837d0104d7\") " pod="openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.068016 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.355139 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.357677 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.360041 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.360326 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.360519 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.360743 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.360897 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.361337 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.361574 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.361728 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-thb9w" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.361926 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-enmknrfvu1mi3" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.362173 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.362345 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.371567 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.374471 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/af28d147-ae7a-4827-8d93-34cc6e211e3b-config-out\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.374512 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/af28d147-ae7a-4827-8d93-34cc6e211e3b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.374536 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af28d147-ae7a-4827-8d93-34cc6e211e3b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.374554 
4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/af28d147-ae7a-4827-8d93-34cc6e211e3b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.374583 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af28d147-ae7a-4827-8d93-34cc6e211e3b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.374643 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.374686 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af28d147-ae7a-4827-8d93-34cc6e211e3b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.374653 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.374717 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-web-config\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.374742 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/af28d147-ae7a-4827-8d93-34cc6e211e3b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.374770 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.374807 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.374831 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8kvw\" (UniqueName: \"kubernetes.io/projected/af28d147-ae7a-4827-8d93-34cc6e211e3b-kube-api-access-t8kvw\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.374877 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/af28d147-ae7a-4827-8d93-34cc6e211e3b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.374906 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.374966 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.374990 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.375020 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.375054 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-config\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.437029 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.476791 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/af28d147-ae7a-4827-8d93-34cc6e211e3b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.476840 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" 
(UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.476869 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.476893 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.476924 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.476963 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-config\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.476990 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/af28d147-ae7a-4827-8d93-34cc6e211e3b-config-out\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.477017 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/af28d147-ae7a-4827-8d93-34cc6e211e3b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.477045 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af28d147-ae7a-4827-8d93-34cc6e211e3b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.477083 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/af28d147-ae7a-4827-8d93-34cc6e211e3b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.477118 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/af28d147-ae7a-4827-8d93-34cc6e211e3b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.477155 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.477175 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af28d147-ae7a-4827-8d93-34cc6e211e3b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.477202 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-web-config\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.477226 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/af28d147-ae7a-4827-8d93-34cc6e211e3b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.477250 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.477277 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.477299 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8kvw\" (UniqueName: \"kubernetes.io/projected/af28d147-ae7a-4827-8d93-34cc6e211e3b-kube-api-access-t8kvw\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.477663 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/af28d147-ae7a-4827-8d93-34cc6e211e3b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.478230 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/af28d147-ae7a-4827-8d93-34cc6e211e3b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.480590 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af28d147-ae7a-4827-8d93-34cc6e211e3b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.480885 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/af28d147-ae7a-4827-8d93-34cc6e211e3b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.481784 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af28d147-ae7a-4827-8d93-34cc6e211e3b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.482412 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.482893 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.483153 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.484679 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.485143 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-web-config\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.491268 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-config\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.493580 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8kvw\" (UniqueName: \"kubernetes.io/projected/af28d147-ae7a-4827-8d93-34cc6e211e3b-kube-api-access-t8kvw\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.493806 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/af28d147-ae7a-4827-8d93-34cc6e211e3b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.493863 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.494263 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/af28d147-ae7a-4827-8d93-34cc6e211e3b-config-out\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.494447 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.494664 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/af28d147-ae7a-4827-8d93-34cc6e211e3b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.495772 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/af28d147-ae7a-4827-8d93-34cc6e211e3b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"af28d147-ae7a-4827-8d93-34cc6e211e3b\") " pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.744383 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.935856 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" event={"ID":"3c267f0a-b0bb-43fe-9a21-92472096a632","Type":"ContainerStarted","Data":"627c026bf6f872a0263030cbcb6c3fb26fb9db6064776e7139b66837c2c04ec8"} Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.937377 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bd5cd66dc-t9skx" event={"ID":"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1","Type":"ContainerStarted","Data":"dbc676935e4b51a190056a0266fba09f1808de3fe2c86c7f93ab638901469785"} Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.937420 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bd5cd66dc-t9skx" event={"ID":"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1","Type":"ContainerStarted","Data":"806c1e28d938aa0bdf19e752aaaf3e23a3ffac2398b7c5e36e49b247462b0299"} Jan 05 13:54:55 crc kubenswrapper[4740]: I0105 13:54:55.961057 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6bd5cd66dc-t9skx" podStartSLOduration=2.961038135 podStartE2EDuration="2.961038135s" podCreationTimestamp="2026-01-05 13:54:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:54:55.955765048 +0000 UTC m=+345.262673647" watchObservedRunningTime="2026-01-05 13:54:55.961038135 +0000 UTC m=+345.267946714" Jan 05 13:54:56 crc kubenswrapper[4740]: I0105 13:54:56.145975 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-85559f775f-nmxz8"] Jan 05 13:54:56 crc kubenswrapper[4740]: W0105 13:54:56.159354 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e7d78e7_6855_46c1_a51c_7f4127c80b7d.slice/crio-f641f839040883b8818a9916bfe86b70c7c40711e2cae5c4effacb17108f7657 WatchSource:0}: Error finding container f641f839040883b8818a9916bfe86b70c7c40711e2cae5c4effacb17108f7657: Status 404 returned error can't find the container with id f641f839040883b8818a9916bfe86b70c7c40711e2cae5c4effacb17108f7657 Jan 05 13:54:56 crc kubenswrapper[4740]: I0105 13:54:56.160738 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4"] Jan 05 13:54:56 crc kubenswrapper[4740]: W0105 13:54:56.167678 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc85eb61e_dde6_42e7_b3d9_82837d0104d7.slice/crio-7a8a336536a2554075b37819c118f62e878ac2df54737b808a826c28da8cf5d2 WatchSource:0}: Error finding container 7a8a336536a2554075b37819c118f62e878ac2df54737b808a826c28da8cf5d2: Status 404 returned error can't find the container with id 7a8a336536a2554075b37819c118f62e878ac2df54737b808a826c28da8cf5d2 Jan 05 13:54:56 crc kubenswrapper[4740]: I0105 13:54:56.250588 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Jan 05 13:54:56 crc kubenswrapper[4740]: W0105 13:54:56.254499 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf28d147_ae7a_4827_8d93_34cc6e211e3b.slice/crio-5f9400c57f457f2cec30cec4fe24ff3f0b6b624f1703541227abcc7e00c8a70a WatchSource:0}: Error finding 
container 5f9400c57f457f2cec30cec4fe24ff3f0b6b624f1703541227abcc7e00c8a70a: Status 404 returned error can't find the container with id 5f9400c57f457f2cec30cec4fe24ff3f0b6b624f1703541227abcc7e00c8a70a Jan 05 13:54:56 crc kubenswrapper[4740]: I0105 13:54:56.957418 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4" event={"ID":"c85eb61e-dde6-42e7-b3d9-82837d0104d7","Type":"ContainerStarted","Data":"7a8a336536a2554075b37819c118f62e878ac2df54737b808a826c28da8cf5d2"} Jan 05 13:54:56 crc kubenswrapper[4740]: I0105 13:54:56.959487 4740 generic.go:334] "Generic (PLEG): container finished" podID="af28d147-ae7a-4827-8d93-34cc6e211e3b" containerID="db2c404134672718a3de3472c3c53bbec75e93e471ba88f48ddfa3723ff142f3" exitCode=0 Jan 05 13:54:56 crc kubenswrapper[4740]: I0105 13:54:56.959528 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"af28d147-ae7a-4827-8d93-34cc6e211e3b","Type":"ContainerDied","Data":"db2c404134672718a3de3472c3c53bbec75e93e471ba88f48ddfa3723ff142f3"} Jan 05 13:54:56 crc kubenswrapper[4740]: I0105 13:54:56.959557 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"af28d147-ae7a-4827-8d93-34cc6e211e3b","Type":"ContainerStarted","Data":"5f9400c57f457f2cec30cec4fe24ff3f0b6b624f1703541227abcc7e00c8a70a"} Jan 05 13:54:56 crc kubenswrapper[4740]: I0105 13:54:56.964567 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" event={"ID":"3c267f0a-b0bb-43fe-9a21-92472096a632","Type":"ContainerStarted","Data":"f0dbcc4c89837e8ef2acb05f0f876bfea92f51abe9f4073bca92b8d175b1a9fb"} Jan 05 13:54:56 crc kubenswrapper[4740]: I0105 13:54:56.964632 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" event={"ID":"3c267f0a-b0bb-43fe-9a21-92472096a632","Type":"ContainerStarted","Data":"c753b59cd634e626da9e6d5c4d76e6c0d09d2797763c1276ea69fcf29b8f9b47"} Jan 05 13:54:56 crc kubenswrapper[4740]: I0105 13:54:56.979484 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" event={"ID":"8e7d78e7-6855-46c1-a51c-7f4127c80b7d","Type":"ContainerStarted","Data":"f641f839040883b8818a9916bfe86b70c7c40711e2cae5c4effacb17108f7657"} Jan 05 13:54:57 crc kubenswrapper[4740]: I0105 13:54:57.978007 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88defe7b-d680-4acf-880d-ac342e7f0a73","Type":"ContainerStarted","Data":"c2e377a2d56c0f4f900dde6faee753d74afa88e7c64c97d1b6764a0c8fc9fba4"} Jan 05 13:54:57 crc kubenswrapper[4740]: I0105 13:54:57.978420 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88defe7b-d680-4acf-880d-ac342e7f0a73","Type":"ContainerStarted","Data":"46992d727c8ea4c1ef74226fa252669bcaf8da13360b071327887215c0483349"} Jan 05 13:54:57 crc kubenswrapper[4740]: I0105 13:54:57.978440 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88defe7b-d680-4acf-880d-ac342e7f0a73","Type":"ContainerStarted","Data":"e47cc60cd9b91e73254b45e763c85a820fae4011167070addb27f67da18cfa49"} Jan 05 13:54:57 crc kubenswrapper[4740]: I0105 13:54:57.980139 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" 
event={"ID":"3c267f0a-b0bb-43fe-9a21-92472096a632","Type":"ContainerStarted","Data":"7bf1444b6fa3dc362764b12308946b8601be8b9416829faebd9afc8a84e441b2"} Jan 05 13:54:58 crc kubenswrapper[4740]: I0105 13:54:58.989773 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88defe7b-d680-4acf-880d-ac342e7f0a73","Type":"ContainerStarted","Data":"c7d400b157467134a298af78037310a5bf18702fcc47ae5decf18222f60e6375"} Jan 05 13:54:58 crc kubenswrapper[4740]: I0105 13:54:58.990214 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88defe7b-d680-4acf-880d-ac342e7f0a73","Type":"ContainerStarted","Data":"841f53281a710556e1d9adb2d45626953f8b5a962d01de530774290831be13b5"} Jan 05 13:54:58 crc kubenswrapper[4740]: I0105 13:54:58.990225 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88defe7b-d680-4acf-880d-ac342e7f0a73","Type":"ContainerStarted","Data":"c4c128a7445903835add09b77d42b522cdcf9c2fac6413113298ca7e555a0486"} Jan 05 13:54:58 crc kubenswrapper[4740]: I0105 13:54:58.995875 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" event={"ID":"3c267f0a-b0bb-43fe-9a21-92472096a632","Type":"ContainerStarted","Data":"fb718d00b279bb5e1c82d6900bdd806aa9f528dd5c31c1c67879a63884777080"} Jan 05 13:54:58 crc kubenswrapper[4740]: I0105 13:54:58.995925 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" event={"ID":"3c267f0a-b0bb-43fe-9a21-92472096a632","Type":"ContainerStarted","Data":"edf194ac3143ab4b65a7dbf5a255f9e61c2a3afedaf5a0734b6c2842de201810"} Jan 05 13:54:58 crc kubenswrapper[4740]: I0105 13:54:58.996078 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:54:58 crc kubenswrapper[4740]: I0105 13:54:58.997614 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4" event={"ID":"c85eb61e-dde6-42e7-b3d9-82837d0104d7","Type":"ContainerStarted","Data":"d121c5eaeea818ac6a32101795d38a7fa92353aee20cf26059ca337378572f00"} Jan 05 13:54:58 crc kubenswrapper[4740]: I0105 13:54:58.997781 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4" Jan 05 13:54:59 crc kubenswrapper[4740]: I0105 13:54:59.000799 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" event={"ID":"8e7d78e7-6855-46c1-a51c-7f4127c80b7d","Type":"ContainerStarted","Data":"9d5fe795fa0a598ed45200254e5e10188991bf9d527185dba340fbf72b4a0bb8"} Jan 05 13:54:59 crc kubenswrapper[4740]: I0105 13:54:59.005211 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4" Jan 05 13:54:59 crc kubenswrapper[4740]: I0105 13:54:59.036768 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.940543761 podStartE2EDuration="9.036750408s" podCreationTimestamp="2026-01-05 13:54:50 +0000 UTC" firstStartedPulling="2026-01-05 13:54:50.813460125 +0000 UTC m=+340.120368704" lastFinishedPulling="2026-01-05 13:54:56.909666732 +0000 UTC m=+346.216575351" observedRunningTime="2026-01-05 13:54:59.035663678 +0000 UTC m=+348.342572297" 
watchObservedRunningTime="2026-01-05 13:54:59.036750408 +0000 UTC m=+348.343658997" Jan 05 13:54:59 crc kubenswrapper[4740]: I0105 13:54:59.076883 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" podStartSLOduration=2.960981865 podStartE2EDuration="5.07686517s" podCreationTimestamp="2026-01-05 13:54:54 +0000 UTC" firstStartedPulling="2026-01-05 13:54:56.161418879 +0000 UTC m=+345.468327468" lastFinishedPulling="2026-01-05 13:54:58.277302194 +0000 UTC m=+347.584210773" observedRunningTime="2026-01-05 13:54:59.070813183 +0000 UTC m=+348.377721772" watchObservedRunningTime="2026-01-05 13:54:59.07686517 +0000 UTC m=+348.383773759" Jan 05 13:54:59 crc kubenswrapper[4740]: I0105 13:54:59.087767 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4" podStartSLOduration=2.991135511 podStartE2EDuration="5.087742902s" podCreationTimestamp="2026-01-05 13:54:54 +0000 UTC" firstStartedPulling="2026-01-05 13:54:56.169708809 +0000 UTC m=+345.476617388" lastFinishedPulling="2026-01-05 13:54:58.2663162 +0000 UTC m=+347.573224779" observedRunningTime="2026-01-05 13:54:59.084224664 +0000 UTC m=+348.391133283" watchObservedRunningTime="2026-01-05 13:54:59.087742902 +0000 UTC m=+348.394651491" Jan 05 13:54:59 crc kubenswrapper[4740]: I0105 13:54:59.128817 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" podStartSLOduration=3.428227601 podStartE2EDuration="8.128800409s" podCreationTimestamp="2026-01-05 13:54:51 +0000 UTC" firstStartedPulling="2026-01-05 13:54:52.753118075 +0000 UTC m=+342.060026664" lastFinishedPulling="2026-01-05 13:54:57.453690893 +0000 UTC m=+346.760599472" observedRunningTime="2026-01-05 13:54:59.120302404 +0000 UTC m=+348.427210983" watchObservedRunningTime="2026-01-05 13:54:59.128800409 +0000 UTC m=+348.435708988" Jan 05 13:55:00 crc kubenswrapper[4740]: I0105 13:55:00.016649 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" Jan 05 13:55:01 crc kubenswrapper[4740]: I0105 13:55:01.022908 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"af28d147-ae7a-4827-8d93-34cc6e211e3b","Type":"ContainerStarted","Data":"00f1118de9d70d3a888ecec435afe5f0754cd57002b6cd52f433f321e949721f"} Jan 05 13:55:02 crc kubenswrapper[4740]: I0105 13:55:02.033236 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"af28d147-ae7a-4827-8d93-34cc6e211e3b","Type":"ContainerStarted","Data":"d7854ae652dc9b15969aad18d67b010fdbcd1bd9280f8a117be683ae906f587c"} Jan 05 13:55:02 crc kubenswrapper[4740]: I0105 13:55:02.033534 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"af28d147-ae7a-4827-8d93-34cc6e211e3b","Type":"ContainerStarted","Data":"aee583f91c12690bdc9d56b8b51d3521fac545954f9438d3d70c9ce5fae647b9"} Jan 05 13:55:02 crc kubenswrapper[4740]: I0105 13:55:02.033546 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"af28d147-ae7a-4827-8d93-34cc6e211e3b","Type":"ContainerStarted","Data":"cc49789cbef197620c30fdb1c38fd66016b96b1e64bfd332100a6c1e38e2ee35"} Jan 05 13:55:02 crc kubenswrapper[4740]: I0105 13:55:02.033555 4740 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"af28d147-ae7a-4827-8d93-34cc6e211e3b","Type":"ContainerStarted","Data":"02e9014a2deb4c9d6a5cd07b04937281f886768f2f4d40d214bab60cc32b1b26"} Jan 05 13:55:02 crc kubenswrapper[4740]: I0105 13:55:02.033564 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"af28d147-ae7a-4827-8d93-34cc6e211e3b","Type":"ContainerStarted","Data":"daff6f4e58ef191622013e896a69e53159d83e0ae55336ebc12288087b1957e9"} Jan 05 13:55:02 crc kubenswrapper[4740]: I0105 13:55:02.081240 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.444615413 podStartE2EDuration="7.081215025s" podCreationTimestamp="2026-01-05 13:54:55 +0000 UTC" firstStartedPulling="2026-01-05 13:54:56.961577681 +0000 UTC m=+346.268486290" lastFinishedPulling="2026-01-05 13:55:00.598177323 +0000 UTC m=+349.905085902" observedRunningTime="2026-01-05 13:55:02.076999248 +0000 UTC m=+351.383907847" watchObservedRunningTime="2026-01-05 13:55:02.081215025 +0000 UTC m=+351.388123614" Jan 05 13:55:04 crc kubenswrapper[4740]: I0105 13:55:04.075901 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:55:04 crc kubenswrapper[4740]: I0105 13:55:04.075982 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:55:04 crc kubenswrapper[4740]: I0105 13:55:04.083814 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:55:05 crc kubenswrapper[4740]: I0105 13:55:05.061331 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:55:05 crc kubenswrapper[4740]: I0105 13:55:05.153125 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-qj9kj"] Jan 05 13:55:05 crc kubenswrapper[4740]: I0105 13:55:05.745265 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:55:14 crc kubenswrapper[4740]: I0105 13:55:14.672635 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:55:14 crc kubenswrapper[4740]: I0105 13:55:14.673021 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:55:30 crc kubenswrapper[4740]: I0105 13:55:30.212595 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-qj9kj" podUID="039f49cf-6394-4a22-ba5b-e5b681a51ca6" containerName="console" containerID="cri-o://308db88f61949f73e11ad38650eae3962ede34343072bf590a3396e71131bd89" gracePeriod=15 Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.210226 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-qj9kj_039f49cf-6394-4a22-ba5b-e5b681a51ca6/console/0.log" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.210689 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.266016 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-qj9kj_039f49cf-6394-4a22-ba5b-e5b681a51ca6/console/0.log" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.266095 4740 generic.go:334] "Generic (PLEG): container finished" podID="039f49cf-6394-4a22-ba5b-e5b681a51ca6" containerID="308db88f61949f73e11ad38650eae3962ede34343072bf590a3396e71131bd89" exitCode=2 Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.266130 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qj9kj" event={"ID":"039f49cf-6394-4a22-ba5b-e5b681a51ca6","Type":"ContainerDied","Data":"308db88f61949f73e11ad38650eae3962ede34343072bf590a3396e71131bd89"} Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.266159 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qj9kj" event={"ID":"039f49cf-6394-4a22-ba5b-e5b681a51ca6","Type":"ContainerDied","Data":"5f86689a6ce4cab1d74ce74a7ca2c537828587ffc1ef96a477fb74ee01f31d28"} Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.266170 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-qj9kj" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.266180 4740 scope.go:117] "RemoveContainer" containerID="308db88f61949f73e11ad38650eae3962ede34343072bf590a3396e71131bd89" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.290621 4740 scope.go:117] "RemoveContainer" containerID="308db88f61949f73e11ad38650eae3962ede34343072bf590a3396e71131bd89" Jan 05 13:55:31 crc kubenswrapper[4740]: E0105 13:55:31.291419 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"308db88f61949f73e11ad38650eae3962ede34343072bf590a3396e71131bd89\": container with ID starting with 308db88f61949f73e11ad38650eae3962ede34343072bf590a3396e71131bd89 not found: ID does not exist" containerID="308db88f61949f73e11ad38650eae3962ede34343072bf590a3396e71131bd89" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.291550 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"308db88f61949f73e11ad38650eae3962ede34343072bf590a3396e71131bd89"} err="failed to get container status \"308db88f61949f73e11ad38650eae3962ede34343072bf590a3396e71131bd89\": rpc error: code = NotFound desc = could not find container \"308db88f61949f73e11ad38650eae3962ede34343072bf590a3396e71131bd89\": container with ID starting with 308db88f61949f73e11ad38650eae3962ede34343072bf590a3396e71131bd89 not found: ID does not exist" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.318207 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-oauth-serving-cert\") pod \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.318536 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfsl2\" (UniqueName: \"kubernetes.io/projected/039f49cf-6394-4a22-ba5b-e5b681a51ca6-kube-api-access-mfsl2\") pod \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " Jan 05 13:55:31 crc 
kubenswrapper[4740]: I0105 13:55:31.318727 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-oauth-config\") pod \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.318865 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-service-ca\") pod \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.318972 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-trusted-ca-bundle\") pod \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.319108 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-serving-cert\") pod \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.319217 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-config\") pod \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\" (UID: \"039f49cf-6394-4a22-ba5b-e5b681a51ca6\") " Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.319399 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-service-ca" (OuterVolumeSpecName: "service-ca") pod "039f49cf-6394-4a22-ba5b-e5b681a51ca6" (UID: "039f49cf-6394-4a22-ba5b-e5b681a51ca6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.319671 4740 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-service-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.319787 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "039f49cf-6394-4a22-ba5b-e5b681a51ca6" (UID: "039f49cf-6394-4a22-ba5b-e5b681a51ca6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.319840 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "039f49cf-6394-4a22-ba5b-e5b681a51ca6" (UID: "039f49cf-6394-4a22-ba5b-e5b681a51ca6"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.319807 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-config" (OuterVolumeSpecName: "console-config") pod "039f49cf-6394-4a22-ba5b-e5b681a51ca6" (UID: "039f49cf-6394-4a22-ba5b-e5b681a51ca6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.325584 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "039f49cf-6394-4a22-ba5b-e5b681a51ca6" (UID: "039f49cf-6394-4a22-ba5b-e5b681a51ca6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.328039 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/039f49cf-6394-4a22-ba5b-e5b681a51ca6-kube-api-access-mfsl2" (OuterVolumeSpecName: "kube-api-access-mfsl2") pod "039f49cf-6394-4a22-ba5b-e5b681a51ca6" (UID: "039f49cf-6394-4a22-ba5b-e5b681a51ca6"). InnerVolumeSpecName "kube-api-access-mfsl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.330331 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "039f49cf-6394-4a22-ba5b-e5b681a51ca6" (UID: "039f49cf-6394-4a22-ba5b-e5b681a51ca6"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.420980 4740 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.421011 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.421022 4740 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.421030 4740 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-console-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.421039 4740 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/039f49cf-6394-4a22-ba5b-e5b681a51ca6-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.421047 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfsl2\" (UniqueName: \"kubernetes.io/projected/039f49cf-6394-4a22-ba5b-e5b681a51ca6-kube-api-access-mfsl2\") on node \"crc\" DevicePath \"\"" Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.620923 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-qj9kj"] Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.630642 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-qj9kj"] Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.916028 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 13:55:31 crc kubenswrapper[4740]: I0105 13:55:31.916118 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 13:55:32 crc kubenswrapper[4740]: I0105 13:55:32.984135 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="039f49cf-6394-4a22-ba5b-e5b681a51ca6" path="/var/lib/kubelet/pods/039f49cf-6394-4a22-ba5b-e5b681a51ca6/volumes" Jan 05 13:55:34 crc kubenswrapper[4740]: I0105 13:55:34.678567 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:55:34 crc kubenswrapper[4740]: I0105 13:55:34.682592 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" Jan 05 13:55:55 crc kubenswrapper[4740]: I0105 13:55:55.745026 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:55:55 crc kubenswrapper[4740]: I0105 13:55:55.790822 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:55:56 crc kubenswrapper[4740]: I0105 13:55:56.515834 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Jan 05 13:56:01 crc kubenswrapper[4740]: I0105 13:56:01.916696 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 13:56:01 crc kubenswrapper[4740]: I0105 13:56:01.917473 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.508582 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-75f75bdd6-kzlkk"] Jan 05 13:56:08 crc kubenswrapper[4740]: E0105 13:56:08.509872 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="039f49cf-6394-4a22-ba5b-e5b681a51ca6" containerName="console" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.509892 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="039f49cf-6394-4a22-ba5b-e5b681a51ca6" containerName="console" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.510136 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="039f49cf-6394-4a22-ba5b-e5b681a51ca6" containerName="console" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.510820 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.526008 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75f75bdd6-kzlkk"] Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.669797 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-service-ca\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.669930 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjnvs\" (UniqueName: \"kubernetes.io/projected/8e9a546c-882a-4c69-b739-d682512dd39e-kube-api-access-hjnvs\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.670018 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e9a546c-882a-4c69-b739-d682512dd39e-console-oauth-config\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.670216 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-console-config\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.670252 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-oauth-serving-cert\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.670289 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e9a546c-882a-4c69-b739-d682512dd39e-console-serving-cert\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.670339 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-trusted-ca-bundle\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.772233 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e9a546c-882a-4c69-b739-d682512dd39e-console-oauth-config\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc 
kubenswrapper[4740]: I0105 13:56:08.772364 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-console-config\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.772397 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-oauth-serving-cert\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.772432 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e9a546c-882a-4c69-b739-d682512dd39e-console-serving-cert\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.772485 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-trusted-ca-bundle\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.773603 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-console-config\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.773914 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-service-ca\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.773935 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-oauth-serving-cert\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.774678 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjnvs\" (UniqueName: \"kubernetes.io/projected/8e9a546c-882a-4c69-b739-d682512dd39e-kube-api-access-hjnvs\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.774827 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-trusted-ca-bundle\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.775902 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-service-ca\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.780307 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e9a546c-882a-4c69-b739-d682512dd39e-console-serving-cert\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.780684 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e9a546c-882a-4c69-b739-d682512dd39e-console-oauth-config\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.796831 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjnvs\" (UniqueName: \"kubernetes.io/projected/8e9a546c-882a-4c69-b739-d682512dd39e-kube-api-access-hjnvs\") pod \"console-75f75bdd6-kzlkk\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:08 crc kubenswrapper[4740]: I0105 13:56:08.865641 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:09 crc kubenswrapper[4740]: I0105 13:56:09.168316 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75f75bdd6-kzlkk"] Jan 05 13:56:09 crc kubenswrapper[4740]: I0105 13:56:09.567793 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75f75bdd6-kzlkk" event={"ID":"8e9a546c-882a-4c69-b739-d682512dd39e","Type":"ContainerStarted","Data":"df1063e86f8d755dfa3e8556f20d16100e784c7c0c12876cfe4c1c7e48d828eb"} Jan 05 13:56:09 crc kubenswrapper[4740]: I0105 13:56:09.568392 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75f75bdd6-kzlkk" event={"ID":"8e9a546c-882a-4c69-b739-d682512dd39e","Type":"ContainerStarted","Data":"635cbce9ed95f147e57a14482c77e3215fe350644e187f919690321ccdfeb1ac"} Jan 05 13:56:09 crc kubenswrapper[4740]: I0105 13:56:09.591988 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-75f75bdd6-kzlkk" podStartSLOduration=1.591967895 podStartE2EDuration="1.591967895s" podCreationTimestamp="2026-01-05 13:56:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:56:09.590496545 +0000 UTC m=+418.897405134" watchObservedRunningTime="2026-01-05 13:56:09.591967895 +0000 UTC m=+418.898876494" Jan 05 13:56:11 crc kubenswrapper[4740]: I0105 13:56:11.364905 4740 scope.go:117] "RemoveContainer" containerID="5009012fd01d883db1c9431ad61841bb79087491f8055c89eea17d11cc2b6f18" Jan 05 13:56:11 crc kubenswrapper[4740]: I0105 13:56:11.399555 4740 scope.go:117] "RemoveContainer" containerID="4ff63df9dc8111a76a79899c7d8931edc8bc2574a9da763bdad4b3b0691ea8d1" Jan 05 13:56:18 crc kubenswrapper[4740]: I0105 13:56:18.866867 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:18 crc kubenswrapper[4740]: I0105 13:56:18.867492 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:18 crc kubenswrapper[4740]: I0105 13:56:18.880047 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:19 crc kubenswrapper[4740]: I0105 13:56:19.665266 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 13:56:19 crc kubenswrapper[4740]: I0105 13:56:19.758860 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bd5cd66dc-t9skx"] Jan 05 13:56:31 crc kubenswrapper[4740]: I0105 13:56:31.915952 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 13:56:31 crc kubenswrapper[4740]: I0105 13:56:31.916673 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 13:56:31 crc kubenswrapper[4740]: I0105 13:56:31.916769 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 13:56:31 crc kubenswrapper[4740]: I0105 13:56:31.917782 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"73243fe427b3c563b811bef4fe47899b7220b055a5ab1889d2817322cf522b18"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 13:56:31 crc kubenswrapper[4740]: I0105 13:56:31.917886 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://73243fe427b3c563b811bef4fe47899b7220b055a5ab1889d2817322cf522b18" gracePeriod=600 Jan 05 13:56:32 crc kubenswrapper[4740]: I0105 13:56:32.899255 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="73243fe427b3c563b811bef4fe47899b7220b055a5ab1889d2817322cf522b18" exitCode=0 Jan 05 13:56:32 crc kubenswrapper[4740]: I0105 13:56:32.899384 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"73243fe427b3c563b811bef4fe47899b7220b055a5ab1889d2817322cf522b18"} Jan 05 13:56:32 crc kubenswrapper[4740]: I0105 13:56:32.900266 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"c94fbb7c1e4a27a12915fd96b93743fa30aac1c7bed9369659cc71247bcbb496"} Jan 05 13:56:32 crc kubenswrapper[4740]: I0105 13:56:32.900313 4740 
scope.go:117] "RemoveContainer" containerID="ce8190df163bf1923ad03250cabf835a3e8f9ecb64484dd6a124c97fc8435ba8" Jan 05 13:56:44 crc kubenswrapper[4740]: I0105 13:56:44.813692 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6bd5cd66dc-t9skx" podUID="e0bc4b27-c349-44f2-85d2-cdf9b68fcda1" containerName="console" containerID="cri-o://dbc676935e4b51a190056a0266fba09f1808de3fe2c86c7f93ab638901469785" gracePeriod=15 Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.008585 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bd5cd66dc-t9skx_e0bc4b27-c349-44f2-85d2-cdf9b68fcda1/console/0.log" Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.008678 4740 generic.go:334] "Generic (PLEG): container finished" podID="e0bc4b27-c349-44f2-85d2-cdf9b68fcda1" containerID="dbc676935e4b51a190056a0266fba09f1808de3fe2c86c7f93ab638901469785" exitCode=2 Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.008726 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bd5cd66dc-t9skx" event={"ID":"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1","Type":"ContainerDied","Data":"dbc676935e4b51a190056a0266fba09f1808de3fe2c86c7f93ab638901469785"} Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.208704 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bd5cd66dc-t9skx_e0bc4b27-c349-44f2-85d2-cdf9b68fcda1/console/0.log" Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.208767 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.346191 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-config\") pod \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.346670 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srhrq\" (UniqueName: \"kubernetes.io/projected/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-kube-api-access-srhrq\") pod \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.346791 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-oauth-serving-cert\") pod \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.346843 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-serving-cert\") pod \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.346871 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-service-ca\") pod \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.346934 4740 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-oauth-config\") pod \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.346997 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-trusted-ca-bundle\") pod \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\" (UID: \"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1\") " Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.349436 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-service-ca" (OuterVolumeSpecName: "service-ca") pod "e0bc4b27-c349-44f2-85d2-cdf9b68fcda1" (UID: "e0bc4b27-c349-44f2-85d2-cdf9b68fcda1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.349624 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e0bc4b27-c349-44f2-85d2-cdf9b68fcda1" (UID: "e0bc4b27-c349-44f2-85d2-cdf9b68fcda1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.349491 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e0bc4b27-c349-44f2-85d2-cdf9b68fcda1" (UID: "e0bc4b27-c349-44f2-85d2-cdf9b68fcda1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.349490 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-config" (OuterVolumeSpecName: "console-config") pod "e0bc4b27-c349-44f2-85d2-cdf9b68fcda1" (UID: "e0bc4b27-c349-44f2-85d2-cdf9b68fcda1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.358967 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e0bc4b27-c349-44f2-85d2-cdf9b68fcda1" (UID: "e0bc4b27-c349-44f2-85d2-cdf9b68fcda1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.360352 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e0bc4b27-c349-44f2-85d2-cdf9b68fcda1" (UID: "e0bc4b27-c349-44f2-85d2-cdf9b68fcda1"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.361349 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-kube-api-access-srhrq" (OuterVolumeSpecName: "kube-api-access-srhrq") pod "e0bc4b27-c349-44f2-85d2-cdf9b68fcda1" (UID: "e0bc4b27-c349-44f2-85d2-cdf9b68fcda1"). InnerVolumeSpecName "kube-api-access-srhrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.448759 4740 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.448803 4740 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.448824 4740 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-service-ca\") on node \"crc\" DevicePath \"\"" Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.448841 4740 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.448861 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.448880 4740 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-console-config\") on node \"crc\" DevicePath \"\"" Jan 05 13:56:45 crc kubenswrapper[4740]: I0105 13:56:45.448897 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srhrq\" (UniqueName: \"kubernetes.io/projected/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1-kube-api-access-srhrq\") on node \"crc\" DevicePath \"\"" Jan 05 13:56:46 crc kubenswrapper[4740]: I0105 13:56:46.021117 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bd5cd66dc-t9skx_e0bc4b27-c349-44f2-85d2-cdf9b68fcda1/console/0.log" Jan 05 13:56:46 crc kubenswrapper[4740]: I0105 13:56:46.021261 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bd5cd66dc-t9skx" event={"ID":"e0bc4b27-c349-44f2-85d2-cdf9b68fcda1","Type":"ContainerDied","Data":"806c1e28d938aa0bdf19e752aaaf3e23a3ffac2398b7c5e36e49b247462b0299"} Jan 05 13:56:46 crc kubenswrapper[4740]: I0105 13:56:46.021442 4740 scope.go:117] "RemoveContainer" containerID="dbc676935e4b51a190056a0266fba09f1808de3fe2c86c7f93ab638901469785" Jan 05 13:56:46 crc kubenswrapper[4740]: I0105 13:56:46.021534 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bd5cd66dc-t9skx" Jan 05 13:56:46 crc kubenswrapper[4740]: I0105 13:56:46.078293 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bd5cd66dc-t9skx"] Jan 05 13:56:46 crc kubenswrapper[4740]: I0105 13:56:46.087805 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6bd5cd66dc-t9skx"] Jan 05 13:56:46 crc kubenswrapper[4740]: I0105 13:56:46.982392 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0bc4b27-c349-44f2-85d2-cdf9b68fcda1" path="/var/lib/kubelet/pods/e0bc4b27-c349-44f2-85d2-cdf9b68fcda1/volumes" Jan 05 13:57:11 crc kubenswrapper[4740]: I0105 13:57:11.493655 4740 scope.go:117] "RemoveContainer" containerID="693a1594e8ad2aed0edca20c218625b41b6313f4fe8a1d011abc15898bb6a768" Jan 05 13:57:11 crc kubenswrapper[4740]: I0105 13:57:11.523595 4740 scope.go:117] "RemoveContainer" containerID="e8ffe2cc2e736cc7321dc8730c5bff5e77567a33f36365529087bd6a93a96a80" Jan 05 13:57:19 crc kubenswrapper[4740]: I0105 13:57:19.802694 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p"] Jan 05 13:57:19 crc kubenswrapper[4740]: E0105 13:57:19.803432 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0bc4b27-c349-44f2-85d2-cdf9b68fcda1" containerName="console" Jan 05 13:57:19 crc kubenswrapper[4740]: I0105 13:57:19.803453 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0bc4b27-c349-44f2-85d2-cdf9b68fcda1" containerName="console" Jan 05 13:57:19 crc kubenswrapper[4740]: I0105 13:57:19.803654 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0bc4b27-c349-44f2-85d2-cdf9b68fcda1" containerName="console" Jan 05 13:57:19 crc kubenswrapper[4740]: I0105 13:57:19.804689 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" Jan 05 13:57:19 crc kubenswrapper[4740]: I0105 13:57:19.807093 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 05 13:57:19 crc kubenswrapper[4740]: I0105 13:57:19.815694 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p"] Jan 05 13:57:19 crc kubenswrapper[4740]: I0105 13:57:19.933331 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxxjw\" (UniqueName: \"kubernetes.io/projected/f127a860-4853-4359-9cef-ec66add405e3-kube-api-access-mxxjw\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p\" (UID: \"f127a860-4853-4359-9cef-ec66add405e3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" Jan 05 13:57:19 crc kubenswrapper[4740]: I0105 13:57:19.933427 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f127a860-4853-4359-9cef-ec66add405e3-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p\" (UID: \"f127a860-4853-4359-9cef-ec66add405e3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" Jan 05 13:57:19 crc kubenswrapper[4740]: I0105 13:57:19.933594 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f127a860-4853-4359-9cef-ec66add405e3-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p\" (UID: \"f127a860-4853-4359-9cef-ec66add405e3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" Jan 05 13:57:20 crc kubenswrapper[4740]: I0105 13:57:20.036802 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxxjw\" (UniqueName: \"kubernetes.io/projected/f127a860-4853-4359-9cef-ec66add405e3-kube-api-access-mxxjw\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p\" (UID: \"f127a860-4853-4359-9cef-ec66add405e3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" Jan 05 13:57:20 crc kubenswrapper[4740]: I0105 13:57:20.037319 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f127a860-4853-4359-9cef-ec66add405e3-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p\" (UID: \"f127a860-4853-4359-9cef-ec66add405e3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" Jan 05 13:57:20 crc kubenswrapper[4740]: I0105 13:57:20.037606 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f127a860-4853-4359-9cef-ec66add405e3-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p\" (UID: \"f127a860-4853-4359-9cef-ec66add405e3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" Jan 05 13:57:20 crc kubenswrapper[4740]: I0105 13:57:20.038298 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f127a860-4853-4359-9cef-ec66add405e3-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p\" (UID: \"f127a860-4853-4359-9cef-ec66add405e3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" Jan 05 13:57:20 crc kubenswrapper[4740]: I0105 13:57:20.038442 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f127a860-4853-4359-9cef-ec66add405e3-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p\" (UID: \"f127a860-4853-4359-9cef-ec66add405e3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" Jan 05 13:57:20 crc kubenswrapper[4740]: I0105 13:57:20.062918 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxxjw\" (UniqueName: \"kubernetes.io/projected/f127a860-4853-4359-9cef-ec66add405e3-kube-api-access-mxxjw\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p\" (UID: \"f127a860-4853-4359-9cef-ec66add405e3\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" Jan 05 13:57:20 crc kubenswrapper[4740]: I0105 13:57:20.131954 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" Jan 05 13:57:20 crc kubenswrapper[4740]: I0105 13:57:20.440123 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p"] Jan 05 13:57:21 crc kubenswrapper[4740]: I0105 13:57:21.317932 4740 generic.go:334] "Generic (PLEG): container finished" podID="f127a860-4853-4359-9cef-ec66add405e3" containerID="d899373a9fc44630ed47ab86aeb6a42cb22f38f06f097c4f8fa34057fb340f4e" exitCode=0 Jan 05 13:57:21 crc kubenswrapper[4740]: I0105 13:57:21.318058 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" event={"ID":"f127a860-4853-4359-9cef-ec66add405e3","Type":"ContainerDied","Data":"d899373a9fc44630ed47ab86aeb6a42cb22f38f06f097c4f8fa34057fb340f4e"} Jan 05 13:57:21 crc kubenswrapper[4740]: I0105 13:57:21.318631 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" event={"ID":"f127a860-4853-4359-9cef-ec66add405e3","Type":"ContainerStarted","Data":"635d41ab11848e21522f14af48cb749aa26a6b88c84033754f4e8ac15f261a92"} Jan 05 13:57:21 crc kubenswrapper[4740]: I0105 13:57:21.323740 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 05 13:57:23 crc kubenswrapper[4740]: I0105 13:57:23.334637 4740 generic.go:334] "Generic (PLEG): container finished" podID="f127a860-4853-4359-9cef-ec66add405e3" containerID="09ea7ed4144e86a5167d8d5bf0214291205acda4932d1d00635e2098203304f3" exitCode=0 Jan 05 13:57:23 crc kubenswrapper[4740]: I0105 13:57:23.334793 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" event={"ID":"f127a860-4853-4359-9cef-ec66add405e3","Type":"ContainerDied","Data":"09ea7ed4144e86a5167d8d5bf0214291205acda4932d1d00635e2098203304f3"} Jan 05 13:57:24 crc kubenswrapper[4740]: I0105 13:57:24.346229 4740 generic.go:334] "Generic (PLEG): container finished" 
podID="f127a860-4853-4359-9cef-ec66add405e3" containerID="e7472cd7c362ed69d7d5940de993be8bcd3c5848c8fc0b7bb7f0b6202da9f460" exitCode=0 Jan 05 13:57:24 crc kubenswrapper[4740]: I0105 13:57:24.346347 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" event={"ID":"f127a860-4853-4359-9cef-ec66add405e3","Type":"ContainerDied","Data":"e7472cd7c362ed69d7d5940de993be8bcd3c5848c8fc0b7bb7f0b6202da9f460"} Jan 05 13:57:25 crc kubenswrapper[4740]: I0105 13:57:25.636576 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" Jan 05 13:57:25 crc kubenswrapper[4740]: I0105 13:57:25.658968 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f127a860-4853-4359-9cef-ec66add405e3-util\") pod \"f127a860-4853-4359-9cef-ec66add405e3\" (UID: \"f127a860-4853-4359-9cef-ec66add405e3\") " Jan 05 13:57:25 crc kubenswrapper[4740]: I0105 13:57:25.659077 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxxjw\" (UniqueName: \"kubernetes.io/projected/f127a860-4853-4359-9cef-ec66add405e3-kube-api-access-mxxjw\") pod \"f127a860-4853-4359-9cef-ec66add405e3\" (UID: \"f127a860-4853-4359-9cef-ec66add405e3\") " Jan 05 13:57:25 crc kubenswrapper[4740]: I0105 13:57:25.659106 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f127a860-4853-4359-9cef-ec66add405e3-bundle\") pod \"f127a860-4853-4359-9cef-ec66add405e3\" (UID: \"f127a860-4853-4359-9cef-ec66add405e3\") " Jan 05 13:57:25 crc kubenswrapper[4740]: I0105 13:57:25.661552 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f127a860-4853-4359-9cef-ec66add405e3-bundle" (OuterVolumeSpecName: "bundle") pod "f127a860-4853-4359-9cef-ec66add405e3" (UID: "f127a860-4853-4359-9cef-ec66add405e3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:57:25 crc kubenswrapper[4740]: I0105 13:57:25.666658 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f127a860-4853-4359-9cef-ec66add405e3-kube-api-access-mxxjw" (OuterVolumeSpecName: "kube-api-access-mxxjw") pod "f127a860-4853-4359-9cef-ec66add405e3" (UID: "f127a860-4853-4359-9cef-ec66add405e3"). InnerVolumeSpecName "kube-api-access-mxxjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:57:25 crc kubenswrapper[4740]: I0105 13:57:25.672204 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f127a860-4853-4359-9cef-ec66add405e3-util" (OuterVolumeSpecName: "util") pod "f127a860-4853-4359-9cef-ec66add405e3" (UID: "f127a860-4853-4359-9cef-ec66add405e3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:57:25 crc kubenswrapper[4740]: I0105 13:57:25.760723 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxxjw\" (UniqueName: \"kubernetes.io/projected/f127a860-4853-4359-9cef-ec66add405e3-kube-api-access-mxxjw\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:25 crc kubenswrapper[4740]: I0105 13:57:25.760760 4740 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f127a860-4853-4359-9cef-ec66add405e3-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:25 crc kubenswrapper[4740]: I0105 13:57:25.760772 4740 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f127a860-4853-4359-9cef-ec66add405e3-util\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:26 crc kubenswrapper[4740]: I0105 13:57:26.363948 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" event={"ID":"f127a860-4853-4359-9cef-ec66add405e3","Type":"ContainerDied","Data":"635d41ab11848e21522f14af48cb749aa26a6b88c84033754f4e8ac15f261a92"} Jan 05 13:57:26 crc kubenswrapper[4740]: I0105 13:57:26.364032 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="635d41ab11848e21522f14af48cb749aa26a6b88c84033754f4e8ac15f261a92" Jan 05 13:57:26 crc kubenswrapper[4740]: I0105 13:57:26.364107 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p" Jan 05 13:57:30 crc kubenswrapper[4740]: I0105 13:57:30.990158 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-btftp"] Jan 05 13:57:30 crc kubenswrapper[4740]: I0105 13:57:30.992621 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="ovn-controller" containerID="cri-o://67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4" gracePeriod=30 Jan 05 13:57:30 crc kubenswrapper[4740]: I0105 13:57:30.992690 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="nbdb" containerID="cri-o://b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261" gracePeriod=30 Jan 05 13:57:30 crc kubenswrapper[4740]: I0105 13:57:30.992728 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490" gracePeriod=30 Jan 05 13:57:30 crc kubenswrapper[4740]: I0105 13:57:30.992790 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="northd" containerID="cri-o://512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25" gracePeriod=30 Jan 05 13:57:30 crc kubenswrapper[4740]: I0105 13:57:30.992775 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="kube-rbac-proxy-node" 
containerID="cri-o://3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3" gracePeriod=30 Jan 05 13:57:30 crc kubenswrapper[4740]: I0105 13:57:30.992845 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="ovn-acl-logging" containerID="cri-o://861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4" gracePeriod=30 Jan 05 13:57:30 crc kubenswrapper[4740]: I0105 13:57:30.992869 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="sbdb" containerID="cri-o://43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da" gracePeriod=30 Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.030582 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="ovnkube-controller" containerID="cri-o://8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302" gracePeriod=30 Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.403781 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btftp_0aefcfb9-fb79-4a82-a41c-01a94544f6f6/ovn-acl-logging/0.log" Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.404386 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btftp_0aefcfb9-fb79-4a82-a41c-01a94544f6f6/ovn-controller/0.log" Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.405004 4740 generic.go:334] "Generic (PLEG): container finished" podID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerID="8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302" exitCode=0 Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.405047 4740 generic.go:334] "Generic (PLEG): container finished" podID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerID="43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da" exitCode=0 Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.405111 4740 generic.go:334] "Generic (PLEG): container finished" podID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerID="b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261" exitCode=0 Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.405127 4740 generic.go:334] "Generic (PLEG): container finished" podID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerID="512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25" exitCode=0 Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.405118 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerDied","Data":"8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302"} Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.405178 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerDied","Data":"43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da"} Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.405203 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" 
event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerDied","Data":"b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261"} Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.405241 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerDied","Data":"512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25"} Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.405142 4740 generic.go:334] "Generic (PLEG): container finished" podID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerID="861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4" exitCode=143 Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.405276 4740 generic.go:334] "Generic (PLEG): container finished" podID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerID="67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4" exitCode=143 Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.405259 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerDied","Data":"861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4"} Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.405377 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerDied","Data":"67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4"} Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.407888 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-97tfv_11b442ff-cefe-4a62-bd99-da39c470692e/kube-multus/0.log" Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.407947 4740 generic.go:334] "Generic (PLEG): container finished" podID="11b442ff-cefe-4a62-bd99-da39c470692e" containerID="aec9b04ea574c726c59aca5fe6dcbe0cb011dcb7e240cdea847450d2d54e7f4b" exitCode=2 Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.407982 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-97tfv" event={"ID":"11b442ff-cefe-4a62-bd99-da39c470692e","Type":"ContainerDied","Data":"aec9b04ea574c726c59aca5fe6dcbe0cb011dcb7e240cdea847450d2d54e7f4b"} Jan 05 13:57:31 crc kubenswrapper[4740]: I0105 13:57:31.408672 4740 scope.go:117] "RemoveContainer" containerID="aec9b04ea574c726c59aca5fe6dcbe0cb011dcb7e240cdea847450d2d54e7f4b" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.286764 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btftp_0aefcfb9-fb79-4a82-a41c-01a94544f6f6/ovn-acl-logging/0.log" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.287693 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btftp_0aefcfb9-fb79-4a82-a41c-01a94544f6f6/ovn-controller/0.log" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.288103 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366094 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mwwz8"] Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.366305 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="ovnkube-controller" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366316 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="ovnkube-controller" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.366329 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f127a860-4853-4359-9cef-ec66add405e3" containerName="pull" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366337 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f127a860-4853-4359-9cef-ec66add405e3" containerName="pull" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.366347 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f127a860-4853-4359-9cef-ec66add405e3" containerName="util" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366352 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f127a860-4853-4359-9cef-ec66add405e3" containerName="util" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.366364 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="ovn-controller" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366369 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="ovn-controller" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.366377 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="kube-rbac-proxy-ovn-metrics" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366383 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="kube-rbac-proxy-ovn-metrics" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.366392 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f127a860-4853-4359-9cef-ec66add405e3" containerName="extract" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366398 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f127a860-4853-4359-9cef-ec66add405e3" containerName="extract" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.366407 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="kube-rbac-proxy-node" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366413 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="kube-rbac-proxy-node" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.366420 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="northd" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366425 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="northd" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.366431 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" 
containerName="nbdb" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366437 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="nbdb" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.366450 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="kubecfg-setup" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366456 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="kubecfg-setup" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.366464 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="sbdb" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366469 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="sbdb" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.366478 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="ovn-acl-logging" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366484 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="ovn-acl-logging" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366582 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="ovn-controller" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366593 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="kube-rbac-proxy-ovn-metrics" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366604 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f127a860-4853-4359-9cef-ec66add405e3" containerName="extract" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366610 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="ovnkube-controller" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366617 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="northd" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366624 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="ovn-acl-logging" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366634 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="nbdb" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366641 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="kube-rbac-proxy-node" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.366648 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerName="sbdb" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.368663 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.416976 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btftp_0aefcfb9-fb79-4a82-a41c-01a94544f6f6/ovn-acl-logging/0.log" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.417428 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-btftp_0aefcfb9-fb79-4a82-a41c-01a94544f6f6/ovn-controller/0.log" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.417804 4740 generic.go:334] "Generic (PLEG): container finished" podID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerID="fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490" exitCode=0 Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.417855 4740 generic.go:334] "Generic (PLEG): container finished" podID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" containerID="3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3" exitCode=0 Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.417876 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.417879 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerDied","Data":"fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490"} Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.417914 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerDied","Data":"3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3"} Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.417929 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-btftp" event={"ID":"0aefcfb9-fb79-4a82-a41c-01a94544f6f6","Type":"ContainerDied","Data":"e73f42ee7b6083306d621aa9e56661b99414dd573ac7f19c7043368a4f336d89"} Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.417947 4740 scope.go:117] "RemoveContainer" containerID="8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.419898 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-97tfv_11b442ff-cefe-4a62-bd99-da39c470692e/kube-multus/0.log" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.419931 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-97tfv" event={"ID":"11b442ff-cefe-4a62-bd99-da39c470692e","Type":"ContainerStarted","Data":"05bc1c38cd8efa3228df1ed3f0d6e2580e0f27838f23200b897122a5c36963fb"} Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.439791 4740 scope.go:117] "RemoveContainer" containerID="43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.464791 4740 scope.go:117] "RemoveContainer" containerID="b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.467857 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-cni-netd\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" 
(UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.467893 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-systemd-units\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.467927 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovnkube-script-lib\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.467949 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovnkube-config\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.467942 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.467971 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-openvswitch\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468000 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovn-node-metrics-cert\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468016 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-node-log\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468030 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-log-socket\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468057 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468096 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-var-lib-openvswitch\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468116 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-env-overrides\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468115 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468139 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-run-netns\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468146 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-node-log" (OuterVolumeSpecName: "node-log") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468172 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468194 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-log-socket" (OuterVolumeSpecName: "log-socket") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468211 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468214 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-kubelet\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468229 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468262 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-slash\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468297 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-run-ovn-kubernetes\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468333 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-etc-openvswitch\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468365 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468396 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468420 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-slash" (OuterVolumeSpecName: "host-slash") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468439 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-cni-bin\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468442 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468483 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468498 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468514 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468433 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468647 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-ovn\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468680 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hltxk\" (UniqueName: \"kubernetes.io/projected/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-kube-api-access-hltxk\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468701 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-systemd\") pod \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\" (UID: \"0aefcfb9-fb79-4a82-a41c-01a94544f6f6\") " Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468770 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.468992 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.469613 4740 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.469634 4740 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.469644 4740 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.469654 4740 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.469663 4740 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-slash\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.469671 4740 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.469679 4740 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.469687 4740 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.469695 4740 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.469702 4740 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.469710 4740 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.469718 4740 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.469726 4740 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 05 
13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.469735 4740 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.469742 4740 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-node-log\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.469750 4740 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-log-socket\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.469758 4740 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.475659 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.482265 4740 scope.go:117] "RemoveContainer" containerID="512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.484640 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-kube-api-access-hltxk" (OuterVolumeSpecName: "kube-api-access-hltxk") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "kube-api-access-hltxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.488415 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "0aefcfb9-fb79-4a82-a41c-01a94544f6f6" (UID: "0aefcfb9-fb79-4a82-a41c-01a94544f6f6"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.502489 4740 scope.go:117] "RemoveContainer" containerID="fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.513442 4740 scope.go:117] "RemoveContainer" containerID="3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.528268 4740 scope.go:117] "RemoveContainer" containerID="861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.548648 4740 scope.go:117] "RemoveContainer" containerID="67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.570533 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-slash\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.570577 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-run-netns\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.570599 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-log-socket\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.570662 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-node-log\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.570684 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3b6933aa-4b90-4f86-80bb-007f3b987d77-ovnkube-script-lib\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.570728 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-kubelet\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.570784 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3b6933aa-4b90-4f86-80bb-007f3b987d77-ovnkube-config\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.570809 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-etc-openvswitch\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.570822 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-cni-netd\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.570872 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3b6933aa-4b90-4f86-80bb-007f3b987d77-env-overrides\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.570889 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-run-ovn-kubernetes\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.570902 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b6933aa-4b90-4f86-80bb-007f3b987d77-ovn-node-metrics-cert\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.570947 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9d5d\" (UniqueName: \"kubernetes.io/projected/3b6933aa-4b90-4f86-80bb-007f3b987d77-kube-api-access-k9d5d\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.570964 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-var-lib-openvswitch\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.571013 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.571116 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-systemd-units\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.571150 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-run-ovn\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.571165 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-run-systemd\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.571213 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-run-openvswitch\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.571237 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-cni-bin\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.571300 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hltxk\" (UniqueName: \"kubernetes.io/projected/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-kube-api-access-hltxk\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.571311 4740 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.571342 4740 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0aefcfb9-fb79-4a82-a41c-01a94544f6f6-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.572360 4740 scope.go:117] "RemoveContainer" containerID="9dea5e53734c6dadf190b9b09617cc1ed1995e4329c777c54169a843a75d292c" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.604043 4740 scope.go:117] "RemoveContainer" containerID="8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.607488 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302\": container with ID starting with 8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302 not found: ID does not exist" containerID="8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.607518 4740 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302"} err="failed to get container status \"8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302\": rpc error: code = NotFound desc = could not find container \"8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302\": container with ID starting with 8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302 not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.607538 4740 scope.go:117] "RemoveContainer" containerID="43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.609660 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da\": container with ID starting with 43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da not found: ID does not exist" containerID="43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.609686 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da"} err="failed to get container status \"43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da\": rpc error: code = NotFound desc = could not find container \"43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da\": container with ID starting with 43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.609698 4740 scope.go:117] "RemoveContainer" containerID="b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.611618 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261\": container with ID starting with b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261 not found: ID does not exist" containerID="b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.611641 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261"} err="failed to get container status \"b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261\": rpc error: code = NotFound desc = could not find container \"b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261\": container with ID starting with b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261 not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.611654 4740 scope.go:117] "RemoveContainer" containerID="512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.615314 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25\": container with ID starting with 512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25 not found: ID does 
not exist" containerID="512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.615337 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25"} err="failed to get container status \"512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25\": rpc error: code = NotFound desc = could not find container \"512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25\": container with ID starting with 512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25 not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.615351 4740 scope.go:117] "RemoveContainer" containerID="fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.619416 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490\": container with ID starting with fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490 not found: ID does not exist" containerID="fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.619444 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490"} err="failed to get container status \"fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490\": rpc error: code = NotFound desc = could not find container \"fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490\": container with ID starting with fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490 not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.619463 4740 scope.go:117] "RemoveContainer" containerID="3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.620895 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3\": container with ID starting with 3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3 not found: ID does not exist" containerID="3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.620919 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3"} err="failed to get container status \"3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3\": rpc error: code = NotFound desc = could not find container \"3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3\": container with ID starting with 3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3 not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.620933 4740 scope.go:117] "RemoveContainer" containerID="861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.624365 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4\": container with ID starting with 861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4 not found: ID does not exist" containerID="861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.624385 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4"} err="failed to get container status \"861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4\": rpc error: code = NotFound desc = could not find container \"861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4\": container with ID starting with 861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4 not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.624399 4740 scope.go:117] "RemoveContainer" containerID="67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.628350 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4\": container with ID starting with 67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4 not found: ID does not exist" containerID="67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.628373 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4"} err="failed to get container status \"67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4\": rpc error: code = NotFound desc = could not find container \"67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4\": container with ID starting with 67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4 not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.628385 4740 scope.go:117] "RemoveContainer" containerID="9dea5e53734c6dadf190b9b09617cc1ed1995e4329c777c54169a843a75d292c" Jan 05 13:57:32 crc kubenswrapper[4740]: E0105 13:57:32.632381 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dea5e53734c6dadf190b9b09617cc1ed1995e4329c777c54169a843a75d292c\": container with ID starting with 9dea5e53734c6dadf190b9b09617cc1ed1995e4329c777c54169a843a75d292c not found: ID does not exist" containerID="9dea5e53734c6dadf190b9b09617cc1ed1995e4329c777c54169a843a75d292c" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.632405 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dea5e53734c6dadf190b9b09617cc1ed1995e4329c777c54169a843a75d292c"} err="failed to get container status \"9dea5e53734c6dadf190b9b09617cc1ed1995e4329c777c54169a843a75d292c\": rpc error: code = NotFound desc = could not find container \"9dea5e53734c6dadf190b9b09617cc1ed1995e4329c777c54169a843a75d292c\": container with ID starting with 9dea5e53734c6dadf190b9b09617cc1ed1995e4329c777c54169a843a75d292c not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.632420 4740 scope.go:117] "RemoveContainer" containerID="8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302" Jan 05 13:57:32 crc 
kubenswrapper[4740]: I0105 13:57:32.634596 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302"} err="failed to get container status \"8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302\": rpc error: code = NotFound desc = could not find container \"8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302\": container with ID starting with 8d9955e6fd0efac01cd667d82fa3023362f3e5dd415154fe9a9fe1b33c35b302 not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.634622 4740 scope.go:117] "RemoveContainer" containerID="43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.634812 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da"} err="failed to get container status \"43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da\": rpc error: code = NotFound desc = could not find container \"43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da\": container with ID starting with 43b60f4fb8e8a8b0f42d7e0351e770870d70e2b71f6b8fab2a975a05c32ee1da not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.634830 4740 scope.go:117] "RemoveContainer" containerID="b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.645637 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261"} err="failed to get container status \"b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261\": rpc error: code = NotFound desc = could not find container \"b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261\": container with ID starting with b7a58c37ad5e9bd53a570c2021f1140296c7329f5a06b6d4ccaff68570821261 not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.645681 4740 scope.go:117] "RemoveContainer" containerID="512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.653856 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25"} err="failed to get container status \"512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25\": rpc error: code = NotFound desc = could not find container \"512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25\": container with ID starting with 512178bdbd7bc4763b3be3556d977709a92365bfee016ecf1a26cfd7ad892a25 not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.653902 4740 scope.go:117] "RemoveContainer" containerID="fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.656411 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490"} err="failed to get container status \"fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490\": rpc error: code = NotFound desc = could not find container \"fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490\": container with ID 
starting with fb769266ddfcf629e18cce3ecb8dad101111eeb6e8751ca1613b99f224a67490 not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.656465 4740 scope.go:117] "RemoveContainer" containerID="3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.656677 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3"} err="failed to get container status \"3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3\": rpc error: code = NotFound desc = could not find container \"3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3\": container with ID starting with 3e884a7385ebd77390b692267e14d3ba17143bedc87fed822ac791c785f129c3 not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.656706 4740 scope.go:117] "RemoveContainer" containerID="861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.656944 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4"} err="failed to get container status \"861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4\": rpc error: code = NotFound desc = could not find container \"861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4\": container with ID starting with 861d2bd279356f522918b9631ce7f351d06b062654612377b80e09e61e01afa4 not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.656984 4740 scope.go:117] "RemoveContainer" containerID="67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.657256 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4"} err="failed to get container status \"67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4\": rpc error: code = NotFound desc = could not find container \"67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4\": container with ID starting with 67de69e16dbcd1b56f4b041f283bfc83c17747ff72309a34231c9b7c3055b3d4 not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.657274 4740 scope.go:117] "RemoveContainer" containerID="9dea5e53734c6dadf190b9b09617cc1ed1995e4329c777c54169a843a75d292c" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.657489 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dea5e53734c6dadf190b9b09617cc1ed1995e4329c777c54169a843a75d292c"} err="failed to get container status \"9dea5e53734c6dadf190b9b09617cc1ed1995e4329c777c54169a843a75d292c\": rpc error: code = NotFound desc = could not find container \"9dea5e53734c6dadf190b9b09617cc1ed1995e4329c777c54169a843a75d292c\": container with ID starting with 9dea5e53734c6dadf190b9b09617cc1ed1995e4329c777c54169a843a75d292c not found: ID does not exist" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672164 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-run-ovn\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672204 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-run-systemd\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672228 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-run-openvswitch\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672250 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-cni-bin\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672270 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-slash\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672290 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-run-netns\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672298 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-run-ovn\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672321 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-run-systemd\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672306 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-log-socket\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672348 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-run-openvswitch\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672371 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-node-log\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672341 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-log-socket\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672388 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-run-netns\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672373 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-slash\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672358 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-cni-bin\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672442 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-node-log\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672493 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3b6933aa-4b90-4f86-80bb-007f3b987d77-ovnkube-script-lib\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672518 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-kubelet\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672532 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3b6933aa-4b90-4f86-80bb-007f3b987d77-ovnkube-config\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672548 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-etc-openvswitch\") pod 
\"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672565 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-cni-netd\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672578 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3b6933aa-4b90-4f86-80bb-007f3b987d77-env-overrides\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672591 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-run-ovn-kubernetes\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672606 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b6933aa-4b90-4f86-80bb-007f3b987d77-ovn-node-metrics-cert\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672620 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9d5d\" (UniqueName: \"kubernetes.io/projected/3b6933aa-4b90-4f86-80bb-007f3b987d77-kube-api-access-k9d5d\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672637 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-var-lib-openvswitch\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672659 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672689 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-systemd-units\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672739 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-systemd-units\") pod 
\"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.672763 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-kubelet\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.673156 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3b6933aa-4b90-4f86-80bb-007f3b987d77-ovnkube-script-lib\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.673194 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-run-ovn-kubernetes\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.673217 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-etc-openvswitch\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.673239 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-cni-netd\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.673296 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3b6933aa-4b90-4f86-80bb-007f3b987d77-ovnkube-config\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.673514 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3b6933aa-4b90-4f86-80bb-007f3b987d77-env-overrides\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.673556 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.673560 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b6933aa-4b90-4f86-80bb-007f3b987d77-var-lib-openvswitch\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.676076 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b6933aa-4b90-4f86-80bb-007f3b987d77-ovn-node-metrics-cert\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.694026 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9d5d\" (UniqueName: \"kubernetes.io/projected/3b6933aa-4b90-4f86-80bb-007f3b987d77-kube-api-access-k9d5d\") pod \"ovnkube-node-mwwz8\" (UID: \"3b6933aa-4b90-4f86-80bb-007f3b987d77\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.750851 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-btftp"] Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.758212 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-btftp"] Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.975656 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aefcfb9-fb79-4a82-a41c-01a94544f6f6" path="/var/lib/kubelet/pods/0aefcfb9-fb79-4a82-a41c-01a94544f6f6/volumes" Jan 05 13:57:32 crc kubenswrapper[4740]: I0105 13:57:32.979995 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:32 crc kubenswrapper[4740]: W0105 13:57:32.995992 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6933aa_4b90_4f86_80bb_007f3b987d77.slice/crio-7b479d28f5ba6bf0653f37a7115367d3a545258a83e06aa29bc48ef980a597b0 WatchSource:0}: Error finding container 7b479d28f5ba6bf0653f37a7115367d3a545258a83e06aa29bc48ef980a597b0: Status 404 returned error can't find the container with id 7b479d28f5ba6bf0653f37a7115367d3a545258a83e06aa29bc48ef980a597b0 Jan 05 13:57:33 crc kubenswrapper[4740]: I0105 13:57:33.426885 4740 generic.go:334] "Generic (PLEG): container finished" podID="3b6933aa-4b90-4f86-80bb-007f3b987d77" containerID="65cfed5920556cb0f3e968019fe16c24faeb254d895b4570219a5205107d97e0" exitCode=0 Jan 05 13:57:33 crc kubenswrapper[4740]: I0105 13:57:33.426957 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" event={"ID":"3b6933aa-4b90-4f86-80bb-007f3b987d77","Type":"ContainerDied","Data":"65cfed5920556cb0f3e968019fe16c24faeb254d895b4570219a5205107d97e0"} Jan 05 13:57:33 crc kubenswrapper[4740]: I0105 13:57:33.427161 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" event={"ID":"3b6933aa-4b90-4f86-80bb-007f3b987d77","Type":"ContainerStarted","Data":"7b479d28f5ba6bf0653f37a7115367d3a545258a83e06aa29bc48ef980a597b0"} Jan 05 13:57:34 crc kubenswrapper[4740]: I0105 13:57:34.447891 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" event={"ID":"3b6933aa-4b90-4f86-80bb-007f3b987d77","Type":"ContainerStarted","Data":"250a8fa8061167fa833c59fd4ae2620c2daa1f5a765a6730a60a9930aaabe277"} Jan 05 13:57:34 crc kubenswrapper[4740]: I0105 13:57:34.448790 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" 
event={"ID":"3b6933aa-4b90-4f86-80bb-007f3b987d77","Type":"ContainerStarted","Data":"5964e21005a044f85da9eb51cfee78bf86dc9fb879d97927fecb93231ee76672"} Jan 05 13:57:34 crc kubenswrapper[4740]: I0105 13:57:34.448804 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" event={"ID":"3b6933aa-4b90-4f86-80bb-007f3b987d77","Type":"ContainerStarted","Data":"81e2043b49c277d162430244098dae07fbb44acfad673bb702fbc9407bd66103"} Jan 05 13:57:34 crc kubenswrapper[4740]: I0105 13:57:34.448817 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" event={"ID":"3b6933aa-4b90-4f86-80bb-007f3b987d77","Type":"ContainerStarted","Data":"4562f562e38d1a962e8d45a90cc9a2c81178929cf60c2bd934ee06489a229f7c"} Jan 05 13:57:34 crc kubenswrapper[4740]: I0105 13:57:34.448828 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" event={"ID":"3b6933aa-4b90-4f86-80bb-007f3b987d77","Type":"ContainerStarted","Data":"70a6daa3a76c080e89d156c79655de11869dc63ac0bbf43a75409828f77b197d"} Jan 05 13:57:34 crc kubenswrapper[4740]: I0105 13:57:34.448838 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" event={"ID":"3b6933aa-4b90-4f86-80bb-007f3b987d77","Type":"ContainerStarted","Data":"9a4b3bc429b9d5b1c83d20bc527e0a008ac7b7ecfee4ae044e7d17fe5c69bcfc"} Jan 05 13:57:36 crc kubenswrapper[4740]: I0105 13:57:36.908402 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7"] Jan 05 13:57:36 crc kubenswrapper[4740]: I0105 13:57:36.909639 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" Jan 05 13:57:36 crc kubenswrapper[4740]: I0105 13:57:36.911650 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-9bv6d" Jan 05 13:57:36 crc kubenswrapper[4740]: I0105 13:57:36.911863 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 05 13:57:36 crc kubenswrapper[4740]: I0105 13:57:36.911973 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 05 13:57:36 crc kubenswrapper[4740]: I0105 13:57:36.932377 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnpzd\" (UniqueName: \"kubernetes.io/projected/abe8ad9d-d5fe-46ef-8220-0d45b4f077b2-kube-api-access-wnpzd\") pod \"obo-prometheus-operator-68bc856cb9-9jwt7\" (UID: \"abe8ad9d-d5fe-46ef-8220-0d45b4f077b2\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.033412 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnpzd\" (UniqueName: \"kubernetes.io/projected/abe8ad9d-d5fe-46ef-8220-0d45b4f077b2-kube-api-access-wnpzd\") pod \"obo-prometheus-operator-68bc856cb9-9jwt7\" (UID: \"abe8ad9d-d5fe-46ef-8220-0d45b4f077b2\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.042751 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4"] Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.043589 4740 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.045384 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-bcqwv" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.048310 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.057418 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw"] Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.058215 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.058839 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnpzd\" (UniqueName: \"kubernetes.io/projected/abe8ad9d-d5fe-46ef-8220-0d45b4f077b2-kube-api-access-wnpzd\") pod \"obo-prometheus-operator-68bc856cb9-9jwt7\" (UID: \"abe8ad9d-d5fe-46ef-8220-0d45b4f077b2\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.135415 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c9f195d7-1f66-4278-98bf-4ed7bdbb42a1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4\" (UID: \"c9f195d7-1f66-4278-98bf-4ed7bdbb42a1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.135496 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f63eb54a-2bd6-4366-a206-095360d8b368-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw\" (UID: \"f63eb54a-2bd6-4366-a206-095360d8b368\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.135530 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f63eb54a-2bd6-4366-a206-095360d8b368-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw\" (UID: \"f63eb54a-2bd6-4366-a206-095360d8b368\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.135556 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c9f195d7-1f66-4278-98bf-4ed7bdbb42a1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4\" (UID: \"c9f195d7-1f66-4278-98bf-4ed7bdbb42a1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.144346 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-lfg5l"] Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.145057 
4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.146439 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-t5h4h" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.146933 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.225882 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.239758 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c9f195d7-1f66-4278-98bf-4ed7bdbb42a1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4\" (UID: \"c9f195d7-1f66-4278-98bf-4ed7bdbb42a1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.239825 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/45f52c16-f526-4498-bc85-2aec3b292a60-observability-operator-tls\") pod \"observability-operator-59bdc8b94-lfg5l\" (UID: \"45f52c16-f526-4498-bc85-2aec3b292a60\") " pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.239889 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f63eb54a-2bd6-4366-a206-095360d8b368-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw\" (UID: \"f63eb54a-2bd6-4366-a206-095360d8b368\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.239939 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f63eb54a-2bd6-4366-a206-095360d8b368-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw\" (UID: \"f63eb54a-2bd6-4366-a206-095360d8b368\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.239979 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bb2q\" (UniqueName: \"kubernetes.io/projected/45f52c16-f526-4498-bc85-2aec3b292a60-kube-api-access-6bb2q\") pod \"observability-operator-59bdc8b94-lfg5l\" (UID: \"45f52c16-f526-4498-bc85-2aec3b292a60\") " pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.240008 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c9f195d7-1f66-4278-98bf-4ed7bdbb42a1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4\" (UID: \"c9f195d7-1f66-4278-98bf-4ed7bdbb42a1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.248571 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f63eb54a-2bd6-4366-a206-095360d8b368-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw\" (UID: \"f63eb54a-2bd6-4366-a206-095360d8b368\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.248584 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f63eb54a-2bd6-4366-a206-095360d8b368-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw\" (UID: \"f63eb54a-2bd6-4366-a206-095360d8b368\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.251487 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c9f195d7-1f66-4278-98bf-4ed7bdbb42a1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4\" (UID: \"c9f195d7-1f66-4278-98bf-4ed7bdbb42a1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.263685 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c9f195d7-1f66-4278-98bf-4ed7bdbb42a1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4\" (UID: \"c9f195d7-1f66-4278-98bf-4ed7bdbb42a1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.273668 4740 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9jwt7_openshift-operators_abe8ad9d-d5fe-46ef-8220-0d45b4f077b2_0(26a76d309aa4b64308faa9c2a3b94ce07fc902f68e323272f1e622421056d609): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.273742 4740 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9jwt7_openshift-operators_abe8ad9d-d5fe-46ef-8220-0d45b4f077b2_0(26a76d309aa4b64308faa9c2a3b94ce07fc902f68e323272f1e622421056d609): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.273761 4740 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9jwt7_openshift-operators_abe8ad9d-d5fe-46ef-8220-0d45b4f077b2_0(26a76d309aa4b64308faa9c2a3b94ce07fc902f68e323272f1e622421056d609): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.273806 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-9jwt7_openshift-operators(abe8ad9d-d5fe-46ef-8220-0d45b4f077b2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-9jwt7_openshift-operators(abe8ad9d-d5fe-46ef-8220-0d45b4f077b2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9jwt7_openshift-operators_abe8ad9d-d5fe-46ef-8220-0d45b4f077b2_0(26a76d309aa4b64308faa9c2a3b94ce07fc902f68e323272f1e622421056d609): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" podUID="abe8ad9d-d5fe-46ef-8220-0d45b4f077b2" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.341570 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bb2q\" (UniqueName: \"kubernetes.io/projected/45f52c16-f526-4498-bc85-2aec3b292a60-kube-api-access-6bb2q\") pod \"observability-operator-59bdc8b94-lfg5l\" (UID: \"45f52c16-f526-4498-bc85-2aec3b292a60\") " pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.341688 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/45f52c16-f526-4498-bc85-2aec3b292a60-observability-operator-tls\") pod \"observability-operator-59bdc8b94-lfg5l\" (UID: \"45f52c16-f526-4498-bc85-2aec3b292a60\") " pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.345534 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/45f52c16-f526-4498-bc85-2aec3b292a60-observability-operator-tls\") pod \"observability-operator-59bdc8b94-lfg5l\" (UID: \"45f52c16-f526-4498-bc85-2aec3b292a60\") " pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.358455 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.360757 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-lpjfp"] Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.361697 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.368003 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-gr5st" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.370283 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bb2q\" (UniqueName: \"kubernetes.io/projected/45f52c16-f526-4498-bc85-2aec3b292a60-kube-api-access-6bb2q\") pod \"observability-operator-59bdc8b94-lfg5l\" (UID: \"45f52c16-f526-4498-bc85-2aec3b292a60\") " pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.380492 4740 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4_openshift-operators_c9f195d7-1f66-4278-98bf-4ed7bdbb42a1_0(5e49ba5f851bfe904ca15da549f72ba4b4589707c995e6d7f5f02334e5c46c08): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.380756 4740 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4_openshift-operators_c9f195d7-1f66-4278-98bf-4ed7bdbb42a1_0(5e49ba5f851bfe904ca15da549f72ba4b4589707c995e6d7f5f02334e5c46c08): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.380774 4740 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4_openshift-operators_c9f195d7-1f66-4278-98bf-4ed7bdbb42a1_0(5e49ba5f851bfe904ca15da549f72ba4b4589707c995e6d7f5f02334e5c46c08): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.380812 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4_openshift-operators(c9f195d7-1f66-4278-98bf-4ed7bdbb42a1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4_openshift-operators(c9f195d7-1f66-4278-98bf-4ed7bdbb42a1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4_openshift-operators_c9f195d7-1f66-4278-98bf-4ed7bdbb42a1_0(5e49ba5f851bfe904ca15da549f72ba4b4589707c995e6d7f5f02334e5c46c08): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" podUID="c9f195d7-1f66-4278-98bf-4ed7bdbb42a1" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.390420 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.414687 4740 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw_openshift-operators_f63eb54a-2bd6-4366-a206-095360d8b368_0(e8fec0a286086d9838edba9c4234e67b35d57681ef0b2c39925c9818995af9b1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.414808 4740 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw_openshift-operators_f63eb54a-2bd6-4366-a206-095360d8b368_0(e8fec0a286086d9838edba9c4234e67b35d57681ef0b2c39925c9818995af9b1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.414892 4740 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw_openshift-operators_f63eb54a-2bd6-4366-a206-095360d8b368_0(e8fec0a286086d9838edba9c4234e67b35d57681ef0b2c39925c9818995af9b1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.414987 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw_openshift-operators(f63eb54a-2bd6-4366-a206-095360d8b368)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw_openshift-operators(f63eb54a-2bd6-4366-a206-095360d8b368)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw_openshift-operators_f63eb54a-2bd6-4366-a206-095360d8b368_0(e8fec0a286086d9838edba9c4234e67b35d57681ef0b2c39925c9818995af9b1): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" podUID="f63eb54a-2bd6-4366-a206-095360d8b368" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.442914 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b13bfd05-5a88-449b-9d26-f11acf9c6bbf-openshift-service-ca\") pod \"perses-operator-5bf474d74f-lpjfp\" (UID: \"b13bfd05-5a88-449b-9d26-f11acf9c6bbf\") " pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.443006 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlwsz\" (UniqueName: \"kubernetes.io/projected/b13bfd05-5a88-449b-9d26-f11acf9c6bbf-kube-api-access-rlwsz\") pod \"perses-operator-5bf474d74f-lpjfp\" (UID: \"b13bfd05-5a88-449b-9d26-f11acf9c6bbf\") " pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.460250 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.467170 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" event={"ID":"3b6933aa-4b90-4f86-80bb-007f3b987d77","Type":"ContainerStarted","Data":"b2725fca98696e370ebbe604925fa2d19f531dd6f3b19aead556068e1ecf59f0"} Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.496433 4740 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lfg5l_openshift-operators_45f52c16-f526-4498-bc85-2aec3b292a60_0(56d4fb7f5b6934b8781dbcee17a200b9ce13f76277921fcd32e2c0f105a5fa58): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.496509 4740 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lfg5l_openshift-operators_45f52c16-f526-4498-bc85-2aec3b292a60_0(56d4fb7f5b6934b8781dbcee17a200b9ce13f76277921fcd32e2c0f105a5fa58): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.496536 4740 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lfg5l_openshift-operators_45f52c16-f526-4498-bc85-2aec3b292a60_0(56d4fb7f5b6934b8781dbcee17a200b9ce13f76277921fcd32e2c0f105a5fa58): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.496589 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-lfg5l_openshift-operators(45f52c16-f526-4498-bc85-2aec3b292a60)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-lfg5l_openshift-operators(45f52c16-f526-4498-bc85-2aec3b292a60)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lfg5l_openshift-operators_45f52c16-f526-4498-bc85-2aec3b292a60_0(56d4fb7f5b6934b8781dbcee17a200b9ce13f76277921fcd32e2c0f105a5fa58): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" podUID="45f52c16-f526-4498-bc85-2aec3b292a60" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.546337 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b13bfd05-5a88-449b-9d26-f11acf9c6bbf-openshift-service-ca\") pod \"perses-operator-5bf474d74f-lpjfp\" (UID: \"b13bfd05-5a88-449b-9d26-f11acf9c6bbf\") " pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.546432 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlwsz\" (UniqueName: \"kubernetes.io/projected/b13bfd05-5a88-449b-9d26-f11acf9c6bbf-kube-api-access-rlwsz\") pod \"perses-operator-5bf474d74f-lpjfp\" (UID: \"b13bfd05-5a88-449b-9d26-f11acf9c6bbf\") " pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.547914 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b13bfd05-5a88-449b-9d26-f11acf9c6bbf-openshift-service-ca\") pod \"perses-operator-5bf474d74f-lpjfp\" (UID: \"b13bfd05-5a88-449b-9d26-f11acf9c6bbf\") " pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.562605 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlwsz\" (UniqueName: \"kubernetes.io/projected/b13bfd05-5a88-449b-9d26-f11acf9c6bbf-kube-api-access-rlwsz\") pod \"perses-operator-5bf474d74f-lpjfp\" (UID: \"b13bfd05-5a88-449b-9d26-f11acf9c6bbf\") " pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:57:37 crc kubenswrapper[4740]: I0105 13:57:37.680812 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.703228 4740 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-lpjfp_openshift-operators_b13bfd05-5a88-449b-9d26-f11acf9c6bbf_0(e47cd9b785fd3095585b73ab628042592b9ea994418fdb1aaa4e99dcd34e64e3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.703310 4740 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-lpjfp_openshift-operators_b13bfd05-5a88-449b-9d26-f11acf9c6bbf_0(e47cd9b785fd3095585b73ab628042592b9ea994418fdb1aaa4e99dcd34e64e3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.703338 4740 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-lpjfp_openshift-operators_b13bfd05-5a88-449b-9d26-f11acf9c6bbf_0(e47cd9b785fd3095585b73ab628042592b9ea994418fdb1aaa4e99dcd34e64e3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:57:37 crc kubenswrapper[4740]: E0105 13:57:37.703389 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-lpjfp_openshift-operators(b13bfd05-5a88-449b-9d26-f11acf9c6bbf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-lpjfp_openshift-operators(b13bfd05-5a88-449b-9d26-f11acf9c6bbf)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-lpjfp_openshift-operators_b13bfd05-5a88-449b-9d26-f11acf9c6bbf_0(e47cd9b785fd3095585b73ab628042592b9ea994418fdb1aaa4e99dcd34e64e3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" podUID="b13bfd05-5a88-449b-9d26-f11acf9c6bbf" Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.459453 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-lfg5l"] Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.459952 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.460397 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.508211 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-lpjfp"] Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.508324 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.508858 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.516035 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" event={"ID":"3b6933aa-4b90-4f86-80bb-007f3b987d77","Type":"ContainerStarted","Data":"741f4fd2b6c72f64cc462bfe266db6eeb584821ad073ec3ddf38972efddc5e31"} Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.517139 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.517175 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.517214 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.517275 4740 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lfg5l_openshift-operators_45f52c16-f526-4498-bc85-2aec3b292a60_0(d30991e7b213ffe17a2cd36126b6437a2ec551fa0127c954aa245fa25a2e9731): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.517305 4740 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lfg5l_openshift-operators_45f52c16-f526-4498-bc85-2aec3b292a60_0(d30991e7b213ffe17a2cd36126b6437a2ec551fa0127c954aa245fa25a2e9731): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.517323 4740 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lfg5l_openshift-operators_45f52c16-f526-4498-bc85-2aec3b292a60_0(d30991e7b213ffe17a2cd36126b6437a2ec551fa0127c954aa245fa25a2e9731): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.517357 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-lfg5l_openshift-operators(45f52c16-f526-4498-bc85-2aec3b292a60)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-lfg5l_openshift-operators(45f52c16-f526-4498-bc85-2aec3b292a60)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lfg5l_openshift-operators_45f52c16-f526-4498-bc85-2aec3b292a60_0(d30991e7b213ffe17a2cd36126b6437a2ec551fa0127c954aa245fa25a2e9731): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" podUID="45f52c16-f526-4498-bc85-2aec3b292a60" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.564524 4740 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-lpjfp_openshift-operators_b13bfd05-5a88-449b-9d26-f11acf9c6bbf_0(3c90099610967123e3b63134d91012bc63bc9da3f1a7f01be5e1c24638b0a5e7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.564816 4740 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-lpjfp_openshift-operators_b13bfd05-5a88-449b-9d26-f11acf9c6bbf_0(3c90099610967123e3b63134d91012bc63bc9da3f1a7f01be5e1c24638b0a5e7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.564835 4740 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-lpjfp_openshift-operators_b13bfd05-5a88-449b-9d26-f11acf9c6bbf_0(3c90099610967123e3b63134d91012bc63bc9da3f1a7f01be5e1c24638b0a5e7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.564875 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-lpjfp_openshift-operators(b13bfd05-5a88-449b-9d26-f11acf9c6bbf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-lpjfp_openshift-operators(b13bfd05-5a88-449b-9d26-f11acf9c6bbf)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-lpjfp_openshift-operators_b13bfd05-5a88-449b-9d26-f11acf9c6bbf_0(3c90099610967123e3b63134d91012bc63bc9da3f1a7f01be5e1c24638b0a5e7): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" podUID="b13bfd05-5a88-449b-9d26-f11acf9c6bbf" Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.597741 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.618469 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" podStartSLOduration=8.618452168 podStartE2EDuration="8.618452168s" podCreationTimestamp="2026-01-05 13:57:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 13:57:40.615577019 +0000 UTC m=+509.922485598" watchObservedRunningTime="2026-01-05 13:57:40.618452168 +0000 UTC m=+509.925360747" Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.640709 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.652137 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7"] Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.652232 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.652650 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.688245 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4"] Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.688380 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.688941 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.696905 4740 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9jwt7_openshift-operators_abe8ad9d-d5fe-46ef-8220-0d45b4f077b2_0(5ab7de7ccb510176d43726dffe4ef45f3b8f011f143c0f5505c0762321ad25a8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.696958 4740 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9jwt7_openshift-operators_abe8ad9d-d5fe-46ef-8220-0d45b4f077b2_0(5ab7de7ccb510176d43726dffe4ef45f3b8f011f143c0f5505c0762321ad25a8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.696977 4740 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9jwt7_openshift-operators_abe8ad9d-d5fe-46ef-8220-0d45b4f077b2_0(5ab7de7ccb510176d43726dffe4ef45f3b8f011f143c0f5505c0762321ad25a8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.697011 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-9jwt7_openshift-operators(abe8ad9d-d5fe-46ef-8220-0d45b4f077b2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-9jwt7_openshift-operators(abe8ad9d-d5fe-46ef-8220-0d45b4f077b2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-9jwt7_openshift-operators_abe8ad9d-d5fe-46ef-8220-0d45b4f077b2_0(5ab7de7ccb510176d43726dffe4ef45f3b8f011f143c0f5505c0762321ad25a8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" podUID="abe8ad9d-d5fe-46ef-8220-0d45b4f077b2" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.718602 4740 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4_openshift-operators_c9f195d7-1f66-4278-98bf-4ed7bdbb42a1_0(a53def3d682cde1c6fb8e6a540f717863f29abdef00d0c79972bfaf676044790): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.718680 4740 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4_openshift-operators_c9f195d7-1f66-4278-98bf-4ed7bdbb42a1_0(a53def3d682cde1c6fb8e6a540f717863f29abdef00d0c79972bfaf676044790): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.718702 4740 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4_openshift-operators_c9f195d7-1f66-4278-98bf-4ed7bdbb42a1_0(a53def3d682cde1c6fb8e6a540f717863f29abdef00d0c79972bfaf676044790): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.718741 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4_openshift-operators(c9f195d7-1f66-4278-98bf-4ed7bdbb42a1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4_openshift-operators(c9f195d7-1f66-4278-98bf-4ed7bdbb42a1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4_openshift-operators_c9f195d7-1f66-4278-98bf-4ed7bdbb42a1_0(a53def3d682cde1c6fb8e6a540f717863f29abdef00d0c79972bfaf676044790): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" podUID="c9f195d7-1f66-4278-98bf-4ed7bdbb42a1" Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.719369 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw"] Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.719472 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" Jan 05 13:57:40 crc kubenswrapper[4740]: I0105 13:57:40.719888 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.740633 4740 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw_openshift-operators_f63eb54a-2bd6-4366-a206-095360d8b368_0(6b69491024c35b7479b845228b9eabe9cd17a8038f9643332e769817d9865d62): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.740682 4740 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw_openshift-operators_f63eb54a-2bd6-4366-a206-095360d8b368_0(6b69491024c35b7479b845228b9eabe9cd17a8038f9643332e769817d9865d62): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.740701 4740 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw_openshift-operators_f63eb54a-2bd6-4366-a206-095360d8b368_0(6b69491024c35b7479b845228b9eabe9cd17a8038f9643332e769817d9865d62): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" Jan 05 13:57:40 crc kubenswrapper[4740]: E0105 13:57:40.740739 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw_openshift-operators(f63eb54a-2bd6-4366-a206-095360d8b368)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw_openshift-operators(f63eb54a-2bd6-4366-a206-095360d8b368)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw_openshift-operators_f63eb54a-2bd6-4366-a206-095360d8b368_0(6b69491024c35b7479b845228b9eabe9cd17a8038f9643332e769817d9865d62): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" podUID="f63eb54a-2bd6-4366-a206-095360d8b368" Jan 05 13:57:50 crc kubenswrapper[4740]: I0105 13:57:50.967514 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:50 crc kubenswrapper[4740]: I0105 13:57:50.971543 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:51 crc kubenswrapper[4740]: I0105 13:57:51.222402 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-lfg5l"] Jan 05 13:57:51 crc kubenswrapper[4740]: I0105 13:57:51.588247 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" event={"ID":"45f52c16-f526-4498-bc85-2aec3b292a60","Type":"ContainerStarted","Data":"217c11cd79b879e2e22f2378e1ad3631fe89fe0aa961b8ff6f5dba88a8c4ce88"} Jan 05 13:57:54 crc kubenswrapper[4740]: I0105 13:57:54.967322 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" Jan 05 13:57:54 crc kubenswrapper[4740]: I0105 13:57:54.968424 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" Jan 05 13:57:54 crc kubenswrapper[4740]: I0105 13:57:54.969003 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" Jan 05 13:57:54 crc kubenswrapper[4740]: I0105 13:57:54.969464 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" Jan 05 13:57:54 crc kubenswrapper[4740]: I0105 13:57:54.970059 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:57:54 crc kubenswrapper[4740]: I0105 13:57:54.970972 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:57:55 crc kubenswrapper[4740]: I0105 13:57:55.967146 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" Jan 05 13:57:55 crc kubenswrapper[4740]: I0105 13:57:55.968533 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" Jan 05 13:57:59 crc kubenswrapper[4740]: I0105 13:57:59.451553 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-lpjfp"] Jan 05 13:57:59 crc kubenswrapper[4740]: I0105 13:57:59.498801 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw"] Jan 05 13:57:59 crc kubenswrapper[4740]: W0105 13:57:59.504041 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf63eb54a_2bd6_4366_a206_095360d8b368.slice/crio-9c38f69363f37156aebbb53d828bbfc0f2b8a99575e5ba88117b73630b645871 WatchSource:0}: Error finding container 9c38f69363f37156aebbb53d828bbfc0f2b8a99575e5ba88117b73630b645871: Status 404 returned error can't find the container with id 9c38f69363f37156aebbb53d828bbfc0f2b8a99575e5ba88117b73630b645871 Jan 05 13:57:59 crc kubenswrapper[4740]: I0105 13:57:59.555046 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4"] Jan 05 13:57:59 crc kubenswrapper[4740]: W0105 13:57:59.561422 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9f195d7_1f66_4278_98bf_4ed7bdbb42a1.slice/crio-a035e72276a934a5b6fffae06ff2bf52a9e20b0ee33f72feb01f6861f2ecc9aa WatchSource:0}: Error finding container a035e72276a934a5b6fffae06ff2bf52a9e20b0ee33f72feb01f6861f2ecc9aa: Status 404 returned error can't find the container with id a035e72276a934a5b6fffae06ff2bf52a9e20b0ee33f72feb01f6861f2ecc9aa Jan 05 13:57:59 crc kubenswrapper[4740]: W0105 13:57:59.565806 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabe8ad9d_d5fe_46ef_8220_0d45b4f077b2.slice/crio-d6f278e103ebce31f71bb535839a9ca0579dd23915f7fdba2062d2add867277e WatchSource:0}: Error finding container d6f278e103ebce31f71bb535839a9ca0579dd23915f7fdba2062d2add867277e: Status 404 returned error can't find the container with id d6f278e103ebce31f71bb535839a9ca0579dd23915f7fdba2062d2add867277e Jan 05 13:57:59 crc kubenswrapper[4740]: I0105 13:57:59.566907 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7"] Jan 05 13:57:59 crc kubenswrapper[4740]: I0105 13:57:59.654897 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" event={"ID":"c9f195d7-1f66-4278-98bf-4ed7bdbb42a1","Type":"ContainerStarted","Data":"a035e72276a934a5b6fffae06ff2bf52a9e20b0ee33f72feb01f6861f2ecc9aa"} Jan 05 13:57:59 crc kubenswrapper[4740]: I0105 13:57:59.656021 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" event={"ID":"b13bfd05-5a88-449b-9d26-f11acf9c6bbf","Type":"ContainerStarted","Data":"cf5a693f057a72fbf28c068b274cc62267323a7a5246bc7205a418364f35163b"} Jan 05 13:57:59 crc kubenswrapper[4740]: I0105 13:57:59.657108 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" event={"ID":"45f52c16-f526-4498-bc85-2aec3b292a60","Type":"ContainerStarted","Data":"8dcb6b31199d5e17c148ecc9fd84477e0e7abb7063ee9d6f58e3390f9189f735"} Jan 05 13:57:59 crc kubenswrapper[4740]: I0105 13:57:59.658197 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:59 crc kubenswrapper[4740]: I0105 13:57:59.663122 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 13:57:59 crc kubenswrapper[4740]: I0105 13:57:59.663678 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" event={"ID":"f63eb54a-2bd6-4366-a206-095360d8b368","Type":"ContainerStarted","Data":"9c38f69363f37156aebbb53d828bbfc0f2b8a99575e5ba88117b73630b645871"} Jan 05 13:57:59 crc kubenswrapper[4740]: I0105 13:57:59.665151 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" event={"ID":"abe8ad9d-d5fe-46ef-8220-0d45b4f077b2","Type":"ContainerStarted","Data":"d6f278e103ebce31f71bb535839a9ca0579dd23915f7fdba2062d2add867277e"} Jan 05 13:57:59 crc kubenswrapper[4740]: I0105 13:57:59.686345 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" podStartSLOduration=14.75675511 podStartE2EDuration="22.686330313s" podCreationTimestamp="2026-01-05 13:57:37 +0000 UTC" firstStartedPulling="2026-01-05 13:57:51.235180839 +0000 UTC m=+520.542089418" lastFinishedPulling="2026-01-05 13:57:59.164756042 +0000 UTC m=+528.471664621" observedRunningTime="2026-01-05 13:57:59.681862632 +0000 UTC m=+528.988771211" watchObservedRunningTime="2026-01-05 13:57:59.686330313 +0000 UTC m=+528.993238892" Jan 05 13:58:03 crc kubenswrapper[4740]: I0105 13:58:03.024919 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mwwz8" Jan 05 13:58:04 crc kubenswrapper[4740]: I0105 13:58:04.731601 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" event={"ID":"c9f195d7-1f66-4278-98bf-4ed7bdbb42a1","Type":"ContainerStarted","Data":"b7d605a270b9759a3c3c461fe96419f8bf85dae90105d3118aa161fede7bd5c8"} Jan 05 13:58:04 crc kubenswrapper[4740]: I0105 13:58:04.733870 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" event={"ID":"b13bfd05-5a88-449b-9d26-f11acf9c6bbf","Type":"ContainerStarted","Data":"52cb6db0b93c7bfaf2fe198741305700dbcfadc3d99046eb8d19399a52874795"} Jan 05 13:58:04 crc kubenswrapper[4740]: I0105 13:58:04.734051 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:58:04 crc kubenswrapper[4740]: I0105 13:58:04.736123 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" event={"ID":"f63eb54a-2bd6-4366-a206-095360d8b368","Type":"ContainerStarted","Data":"fa08459de91c8e794957ea03d42d081a0c6aa44ae9904393da3311c709669941"} Jan 05 13:58:04 crc kubenswrapper[4740]: I0105 13:58:04.738318 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" 
event={"ID":"abe8ad9d-d5fe-46ef-8220-0d45b4f077b2","Type":"ContainerStarted","Data":"cdeb4968cf810fd0bcd26c5455eed5d35b254b6837342f65ae64d1cd53404b9f"} Jan 05 13:58:04 crc kubenswrapper[4740]: I0105 13:58:04.753512 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4" podStartSLOduration=23.781540379 podStartE2EDuration="27.753490435s" podCreationTimestamp="2026-01-05 13:57:37 +0000 UTC" firstStartedPulling="2026-01-05 13:57:59.563402535 +0000 UTC m=+528.870311124" lastFinishedPulling="2026-01-05 13:58:03.535352561 +0000 UTC m=+532.842261180" observedRunningTime="2026-01-05 13:58:04.75073984 +0000 UTC m=+534.057648459" watchObservedRunningTime="2026-01-05 13:58:04.753490435 +0000 UTC m=+534.060399054" Jan 05 13:58:04 crc kubenswrapper[4740]: I0105 13:58:04.792807 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9jwt7" podStartSLOduration=24.822731719 podStartE2EDuration="28.792784802s" podCreationTimestamp="2026-01-05 13:57:36 +0000 UTC" firstStartedPulling="2026-01-05 13:57:59.570440707 +0000 UTC m=+528.877349306" lastFinishedPulling="2026-01-05 13:58:03.54049377 +0000 UTC m=+532.847402389" observedRunningTime="2026-01-05 13:58:04.786972574 +0000 UTC m=+534.093881193" watchObservedRunningTime="2026-01-05 13:58:04.792784802 +0000 UTC m=+534.099693421" Jan 05 13:58:04 crc kubenswrapper[4740]: I0105 13:58:04.818945 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" podStartSLOduration=23.729997799 podStartE2EDuration="27.818923411s" podCreationTimestamp="2026-01-05 13:57:37 +0000 UTC" firstStartedPulling="2026-01-05 13:57:59.462380522 +0000 UTC m=+528.769289101" lastFinishedPulling="2026-01-05 13:58:03.551306094 +0000 UTC m=+532.858214713" observedRunningTime="2026-01-05 13:58:04.81662635 +0000 UTC m=+534.123534959" watchObservedRunningTime="2026-01-05 13:58:04.818923411 +0000 UTC m=+534.125832020" Jan 05 13:58:04 crc kubenswrapper[4740]: I0105 13:58:04.847250 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw" podStartSLOduration=23.8231466 podStartE2EDuration="27.847230881s" podCreationTimestamp="2026-01-05 13:57:37 +0000 UTC" firstStartedPulling="2026-01-05 13:57:59.506763548 +0000 UTC m=+528.813672127" lastFinishedPulling="2026-01-05 13:58:03.530847789 +0000 UTC m=+532.837756408" observedRunningTime="2026-01-05 13:58:04.844302381 +0000 UTC m=+534.151210980" watchObservedRunningTime="2026-01-05 13:58:04.847230881 +0000 UTC m=+534.154139450" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.252508 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-gpw7d"] Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.254103 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gpw7d" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.257217 4740 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-6q4pp" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.257596 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.257922 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.274128 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-gpw7d"] Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.287162 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-swj9b"] Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.288235 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-swj9b" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.292863 4740 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-flsfl" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.294344 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-zc946"] Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.295138 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.296609 4740 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-r72r5" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.305633 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-swj9b"] Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.310268 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-zc946"] Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.402673 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xnww\" (UniqueName: \"kubernetes.io/projected/237e57c0-4e0a-4c9f-89a2-e5a84fb41d22-kube-api-access-4xnww\") pod \"cert-manager-webhook-687f57d79b-zc946\" (UID: \"237e57c0-4e0a-4c9f-89a2-e5a84fb41d22\") " pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.402727 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhvrz\" (UniqueName: \"kubernetes.io/projected/de715cf7-9b2d-4980-96da-f3b7d489b142-kube-api-access-bhvrz\") pod \"cert-manager-cainjector-cf98fcc89-gpw7d\" (UID: \"de715cf7-9b2d-4980-96da-f3b7d489b142\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-gpw7d" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.402945 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tc8b\" (UniqueName: \"kubernetes.io/projected/bd3cc742-4514-481b-8eef-dc428aae4320-kube-api-access-9tc8b\") pod \"cert-manager-858654f9db-swj9b\" (UID: \"bd3cc742-4514-481b-8eef-dc428aae4320\") " pod="cert-manager/cert-manager-858654f9db-swj9b" Jan 05 13:58:09 crc 
kubenswrapper[4740]: I0105 13:58:09.504869 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tc8b\" (UniqueName: \"kubernetes.io/projected/bd3cc742-4514-481b-8eef-dc428aae4320-kube-api-access-9tc8b\") pod \"cert-manager-858654f9db-swj9b\" (UID: \"bd3cc742-4514-481b-8eef-dc428aae4320\") " pod="cert-manager/cert-manager-858654f9db-swj9b" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.504977 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xnww\" (UniqueName: \"kubernetes.io/projected/237e57c0-4e0a-4c9f-89a2-e5a84fb41d22-kube-api-access-4xnww\") pod \"cert-manager-webhook-687f57d79b-zc946\" (UID: \"237e57c0-4e0a-4c9f-89a2-e5a84fb41d22\") " pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.505029 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhvrz\" (UniqueName: \"kubernetes.io/projected/de715cf7-9b2d-4980-96da-f3b7d489b142-kube-api-access-bhvrz\") pod \"cert-manager-cainjector-cf98fcc89-gpw7d\" (UID: \"de715cf7-9b2d-4980-96da-f3b7d489b142\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-gpw7d" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.523722 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tc8b\" (UniqueName: \"kubernetes.io/projected/bd3cc742-4514-481b-8eef-dc428aae4320-kube-api-access-9tc8b\") pod \"cert-manager-858654f9db-swj9b\" (UID: \"bd3cc742-4514-481b-8eef-dc428aae4320\") " pod="cert-manager/cert-manager-858654f9db-swj9b" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.523774 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xnww\" (UniqueName: \"kubernetes.io/projected/237e57c0-4e0a-4c9f-89a2-e5a84fb41d22-kube-api-access-4xnww\") pod \"cert-manager-webhook-687f57d79b-zc946\" (UID: \"237e57c0-4e0a-4c9f-89a2-e5a84fb41d22\") " pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.524165 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhvrz\" (UniqueName: \"kubernetes.io/projected/de715cf7-9b2d-4980-96da-f3b7d489b142-kube-api-access-bhvrz\") pod \"cert-manager-cainjector-cf98fcc89-gpw7d\" (UID: \"de715cf7-9b2d-4980-96da-f3b7d489b142\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-gpw7d" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.574645 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gpw7d" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.606633 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-swj9b" Jan 05 13:58:09 crc kubenswrapper[4740]: I0105 13:58:09.614921 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" Jan 05 13:58:10 crc kubenswrapper[4740]: I0105 13:58:10.033049 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-gpw7d"] Jan 05 13:58:10 crc kubenswrapper[4740]: W0105 13:58:10.033757 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde715cf7_9b2d_4980_96da_f3b7d489b142.slice/crio-03668e61171413921ae37f6a5a910eb60d42dbcc7976b6a7b0dfbe893ded7a6a WatchSource:0}: Error finding container 03668e61171413921ae37f6a5a910eb60d42dbcc7976b6a7b0dfbe893ded7a6a: Status 404 returned error can't find the container with id 03668e61171413921ae37f6a5a910eb60d42dbcc7976b6a7b0dfbe893ded7a6a Jan 05 13:58:10 crc kubenswrapper[4740]: W0105 13:58:10.036301 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd3cc742_4514_481b_8eef_dc428aae4320.slice/crio-a23bc442ab2b2ce723c46a455d2774b06c3d4587dca649ae579d58cf31d5cb0f WatchSource:0}: Error finding container a23bc442ab2b2ce723c46a455d2774b06c3d4587dca649ae579d58cf31d5cb0f: Status 404 returned error can't find the container with id a23bc442ab2b2ce723c46a455d2774b06c3d4587dca649ae579d58cf31d5cb0f Jan 05 13:58:10 crc kubenswrapper[4740]: I0105 13:58:10.041457 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-swj9b"] Jan 05 13:58:10 crc kubenswrapper[4740]: I0105 13:58:10.102490 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-zc946"] Jan 05 13:58:10 crc kubenswrapper[4740]: W0105 13:58:10.109488 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod237e57c0_4e0a_4c9f_89a2_e5a84fb41d22.slice/crio-9fce5377a0c98d577b73a9e658430e33333ead07d0ac60491d71b9ecfb2c5fe4 WatchSource:0}: Error finding container 9fce5377a0c98d577b73a9e658430e33333ead07d0ac60491d71b9ecfb2c5fe4: Status 404 returned error can't find the container with id 9fce5377a0c98d577b73a9e658430e33333ead07d0ac60491d71b9ecfb2c5fe4 Jan 05 13:58:10 crc kubenswrapper[4740]: I0105 13:58:10.800446 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-swj9b" event={"ID":"bd3cc742-4514-481b-8eef-dc428aae4320","Type":"ContainerStarted","Data":"a23bc442ab2b2ce723c46a455d2774b06c3d4587dca649ae579d58cf31d5cb0f"} Jan 05 13:58:10 crc kubenswrapper[4740]: I0105 13:58:10.802683 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gpw7d" event={"ID":"de715cf7-9b2d-4980-96da-f3b7d489b142","Type":"ContainerStarted","Data":"03668e61171413921ae37f6a5a910eb60d42dbcc7976b6a7b0dfbe893ded7a6a"} Jan 05 13:58:10 crc kubenswrapper[4740]: I0105 13:58:10.803977 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" event={"ID":"237e57c0-4e0a-4c9f-89a2-e5a84fb41d22","Type":"ContainerStarted","Data":"9fce5377a0c98d577b73a9e658430e33333ead07d0ac60491d71b9ecfb2c5fe4"} Jan 05 13:58:14 crc kubenswrapper[4740]: I0105 13:58:14.833870 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" event={"ID":"237e57c0-4e0a-4c9f-89a2-e5a84fb41d22","Type":"ContainerStarted","Data":"a81121b4cafa888982fe80a89360b5a2ef5f338e426058529b1b00dfba5ed46d"} Jan 05 13:58:14 crc 
kubenswrapper[4740]: I0105 13:58:14.835475 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" Jan 05 13:58:14 crc kubenswrapper[4740]: I0105 13:58:14.836363 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-swj9b" event={"ID":"bd3cc742-4514-481b-8eef-dc428aae4320","Type":"ContainerStarted","Data":"6daa55972dc19a3056d653767cdc27ea767ff779aae461c6feb5a429b2e3667d"} Jan 05 13:58:14 crc kubenswrapper[4740]: I0105 13:58:14.837525 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gpw7d" event={"ID":"de715cf7-9b2d-4980-96da-f3b7d489b142","Type":"ContainerStarted","Data":"e19dec2ba0fc3ed832312e085f953aaee81bfe4c90778512c5045b57dc7274bb"} Jan 05 13:58:14 crc kubenswrapper[4740]: I0105 13:58:14.853486 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" podStartSLOduration=1.6671393700000001 podStartE2EDuration="5.853465775s" podCreationTimestamp="2026-01-05 13:58:09 +0000 UTC" firstStartedPulling="2026-01-05 13:58:10.112227393 +0000 UTC m=+539.419135982" lastFinishedPulling="2026-01-05 13:58:14.298553808 +0000 UTC m=+543.605462387" observedRunningTime="2026-01-05 13:58:14.853355262 +0000 UTC m=+544.160263841" watchObservedRunningTime="2026-01-05 13:58:14.853465775 +0000 UTC m=+544.160374354" Jan 05 13:58:14 crc kubenswrapper[4740]: I0105 13:58:14.890577 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gpw7d" podStartSLOduration=1.552412585 podStartE2EDuration="5.890560983s" podCreationTimestamp="2026-01-05 13:58:09 +0000 UTC" firstStartedPulling="2026-01-05 13:58:10.036484827 +0000 UTC m=+539.343393446" lastFinishedPulling="2026-01-05 13:58:14.374633255 +0000 UTC m=+543.681541844" observedRunningTime="2026-01-05 13:58:14.872115592 +0000 UTC m=+544.179024181" watchObservedRunningTime="2026-01-05 13:58:14.890560983 +0000 UTC m=+544.197469562" Jan 05 13:58:17 crc kubenswrapper[4740]: I0105 13:58:17.683315 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 13:58:17 crc kubenswrapper[4740]: I0105 13:58:17.703241 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-swj9b" podStartSLOduration=4.433905103 podStartE2EDuration="8.703225561s" podCreationTimestamp="2026-01-05 13:58:09 +0000 UTC" firstStartedPulling="2026-01-05 13:58:10.038900863 +0000 UTC m=+539.345809442" lastFinishedPulling="2026-01-05 13:58:14.308221311 +0000 UTC m=+543.615129900" observedRunningTime="2026-01-05 13:58:14.888681132 +0000 UTC m=+544.195589711" watchObservedRunningTime="2026-01-05 13:58:17.703225561 +0000 UTC m=+547.010134140" Jan 05 13:58:19 crc kubenswrapper[4740]: I0105 13:58:19.622489 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.417300 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh"] Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.420102 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.422040 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.434183 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh"] Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.518598 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/912ddca7-b1d4-4430-8b31-36519503d33e-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh\" (UID: \"912ddca7-b1d4-4430-8b31-36519503d33e\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.518679 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/912ddca7-b1d4-4430-8b31-36519503d33e-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh\" (UID: \"912ddca7-b1d4-4430-8b31-36519503d33e\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.518715 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8dcr\" (UniqueName: \"kubernetes.io/projected/912ddca7-b1d4-4430-8b31-36519503d33e-kube-api-access-m8dcr\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh\" (UID: \"912ddca7-b1d4-4430-8b31-36519503d33e\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.619733 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/912ddca7-b1d4-4430-8b31-36519503d33e-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh\" (UID: \"912ddca7-b1d4-4430-8b31-36519503d33e\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.620007 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8dcr\" (UniqueName: \"kubernetes.io/projected/912ddca7-b1d4-4430-8b31-36519503d33e-kube-api-access-m8dcr\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh\" (UID: \"912ddca7-b1d4-4430-8b31-36519503d33e\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.620140 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/912ddca7-b1d4-4430-8b31-36519503d33e-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh\" (UID: \"912ddca7-b1d4-4430-8b31-36519503d33e\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.620829 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/912ddca7-b1d4-4430-8b31-36519503d33e-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh\" (UID: \"912ddca7-b1d4-4430-8b31-36519503d33e\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.620879 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/912ddca7-b1d4-4430-8b31-36519503d33e-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh\" (UID: \"912ddca7-b1d4-4430-8b31-36519503d33e\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.649386 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8dcr\" (UniqueName: \"kubernetes.io/projected/912ddca7-b1d4-4430-8b31-36519503d33e-kube-api-access-m8dcr\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh\" (UID: \"912ddca7-b1d4-4430-8b31-36519503d33e\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.759502 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.813128 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4"] Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.814304 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.850072 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4"] Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.925584 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e82c700b-18bd-40e7-a546-67dcf6484bc9-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4\" (UID: \"e82c700b-18bd-40e7-a546-67dcf6484bc9\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.925916 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e82c700b-18bd-40e7-a546-67dcf6484bc9-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4\" (UID: \"e82c700b-18bd-40e7-a546-67dcf6484bc9\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" Jan 05 13:58:44 crc kubenswrapper[4740]: I0105 13:58:44.926089 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8srxw\" (UniqueName: \"kubernetes.io/projected/e82c700b-18bd-40e7-a546-67dcf6484bc9-kube-api-access-8srxw\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4\" (UID: \"e82c700b-18bd-40e7-a546-67dcf6484bc9\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" Jan 05 13:58:45 
crc kubenswrapper[4740]: I0105 13:58:45.027428 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8srxw\" (UniqueName: \"kubernetes.io/projected/e82c700b-18bd-40e7-a546-67dcf6484bc9-kube-api-access-8srxw\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4\" (UID: \"e82c700b-18bd-40e7-a546-67dcf6484bc9\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" Jan 05 13:58:45 crc kubenswrapper[4740]: I0105 13:58:45.027503 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e82c700b-18bd-40e7-a546-67dcf6484bc9-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4\" (UID: \"e82c700b-18bd-40e7-a546-67dcf6484bc9\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" Jan 05 13:58:45 crc kubenswrapper[4740]: I0105 13:58:45.027594 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e82c700b-18bd-40e7-a546-67dcf6484bc9-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4\" (UID: \"e82c700b-18bd-40e7-a546-67dcf6484bc9\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" Jan 05 13:58:45 crc kubenswrapper[4740]: I0105 13:58:45.028002 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e82c700b-18bd-40e7-a546-67dcf6484bc9-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4\" (UID: \"e82c700b-18bd-40e7-a546-67dcf6484bc9\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" Jan 05 13:58:45 crc kubenswrapper[4740]: I0105 13:58:45.028235 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e82c700b-18bd-40e7-a546-67dcf6484bc9-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4\" (UID: \"e82c700b-18bd-40e7-a546-67dcf6484bc9\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" Jan 05 13:58:45 crc kubenswrapper[4740]: I0105 13:58:45.048857 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8srxw\" (UniqueName: \"kubernetes.io/projected/e82c700b-18bd-40e7-a546-67dcf6484bc9-kube-api-access-8srxw\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4\" (UID: \"e82c700b-18bd-40e7-a546-67dcf6484bc9\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" Jan 05 13:58:45 crc kubenswrapper[4740]: I0105 13:58:45.153672 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" Jan 05 13:58:45 crc kubenswrapper[4740]: I0105 13:58:45.264615 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh"] Jan 05 13:58:45 crc kubenswrapper[4740]: W0105 13:58:45.271715 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod912ddca7_b1d4_4430_8b31_36519503d33e.slice/crio-02f88a68a9b8a038df5e9b19e8a28411bf37ac16ed0592f0faaebaa48cb7470b WatchSource:0}: Error finding container 02f88a68a9b8a038df5e9b19e8a28411bf37ac16ed0592f0faaebaa48cb7470b: Status 404 returned error can't find the container with id 02f88a68a9b8a038df5e9b19e8a28411bf37ac16ed0592f0faaebaa48cb7470b Jan 05 13:58:45 crc kubenswrapper[4740]: I0105 13:58:45.396676 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4"] Jan 05 13:58:45 crc kubenswrapper[4740]: W0105 13:58:45.403188 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode82c700b_18bd_40e7_a546_67dcf6484bc9.slice/crio-5df2fe099140107939f1bd8948d5e4ee07b362d429a7c0cdd08034610c6a280e WatchSource:0}: Error finding container 5df2fe099140107939f1bd8948d5e4ee07b362d429a7c0cdd08034610c6a280e: Status 404 returned error can't find the container with id 5df2fe099140107939f1bd8948d5e4ee07b362d429a7c0cdd08034610c6a280e Jan 05 13:58:46 crc kubenswrapper[4740]: I0105 13:58:46.192601 4740 generic.go:334] "Generic (PLEG): container finished" podID="912ddca7-b1d4-4430-8b31-36519503d33e" containerID="b40436963302132263e2a02cacbbbcc769edd377d3e726b77fba8a48dfba6bb0" exitCode=0 Jan 05 13:58:46 crc kubenswrapper[4740]: I0105 13:58:46.192735 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" event={"ID":"912ddca7-b1d4-4430-8b31-36519503d33e","Type":"ContainerDied","Data":"b40436963302132263e2a02cacbbbcc769edd377d3e726b77fba8a48dfba6bb0"} Jan 05 13:58:46 crc kubenswrapper[4740]: I0105 13:58:46.192821 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" event={"ID":"912ddca7-b1d4-4430-8b31-36519503d33e","Type":"ContainerStarted","Data":"02f88a68a9b8a038df5e9b19e8a28411bf37ac16ed0592f0faaebaa48cb7470b"} Jan 05 13:58:46 crc kubenswrapper[4740]: I0105 13:58:46.197873 4740 generic.go:334] "Generic (PLEG): container finished" podID="e82c700b-18bd-40e7-a546-67dcf6484bc9" containerID="f2e42c4d107a68b3a6cbdedee589916177a843aee1cba44ba4ee79b5f42cfd2c" exitCode=0 Jan 05 13:58:46 crc kubenswrapper[4740]: I0105 13:58:46.197930 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" event={"ID":"e82c700b-18bd-40e7-a546-67dcf6484bc9","Type":"ContainerDied","Data":"f2e42c4d107a68b3a6cbdedee589916177a843aee1cba44ba4ee79b5f42cfd2c"} Jan 05 13:58:46 crc kubenswrapper[4740]: I0105 13:58:46.197971 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" 
event={"ID":"e82c700b-18bd-40e7-a546-67dcf6484bc9","Type":"ContainerStarted","Data":"5df2fe099140107939f1bd8948d5e4ee07b362d429a7c0cdd08034610c6a280e"} Jan 05 13:58:47 crc kubenswrapper[4740]: I0105 13:58:47.207147 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" event={"ID":"e82c700b-18bd-40e7-a546-67dcf6484bc9","Type":"ContainerStarted","Data":"367a4c61cdab3bc0d1c14122626357f8917e08e2bbb2227c4c2e1cfecc1a4e19"} Jan 05 13:58:48 crc kubenswrapper[4740]: I0105 13:58:48.216467 4740 generic.go:334] "Generic (PLEG): container finished" podID="e82c700b-18bd-40e7-a546-67dcf6484bc9" containerID="367a4c61cdab3bc0d1c14122626357f8917e08e2bbb2227c4c2e1cfecc1a4e19" exitCode=0 Jan 05 13:58:48 crc kubenswrapper[4740]: I0105 13:58:48.216554 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" event={"ID":"e82c700b-18bd-40e7-a546-67dcf6484bc9","Type":"ContainerDied","Data":"367a4c61cdab3bc0d1c14122626357f8917e08e2bbb2227c4c2e1cfecc1a4e19"} Jan 05 13:58:49 crc kubenswrapper[4740]: I0105 13:58:49.226593 4740 generic.go:334] "Generic (PLEG): container finished" podID="e82c700b-18bd-40e7-a546-67dcf6484bc9" containerID="321395d422f147d25e6119bb450b88e59fd9ac1cf4cdc7309d9fef9598d24689" exitCode=0 Jan 05 13:58:49 crc kubenswrapper[4740]: I0105 13:58:49.226693 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" event={"ID":"e82c700b-18bd-40e7-a546-67dcf6484bc9","Type":"ContainerDied","Data":"321395d422f147d25e6119bb450b88e59fd9ac1cf4cdc7309d9fef9598d24689"} Jan 05 13:58:50 crc kubenswrapper[4740]: I0105 13:58:50.583135 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" Jan 05 13:58:50 crc kubenswrapper[4740]: I0105 13:58:50.631237 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8srxw\" (UniqueName: \"kubernetes.io/projected/e82c700b-18bd-40e7-a546-67dcf6484bc9-kube-api-access-8srxw\") pod \"e82c700b-18bd-40e7-a546-67dcf6484bc9\" (UID: \"e82c700b-18bd-40e7-a546-67dcf6484bc9\") " Jan 05 13:58:50 crc kubenswrapper[4740]: I0105 13:58:50.631398 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e82c700b-18bd-40e7-a546-67dcf6484bc9-bundle\") pod \"e82c700b-18bd-40e7-a546-67dcf6484bc9\" (UID: \"e82c700b-18bd-40e7-a546-67dcf6484bc9\") " Jan 05 13:58:50 crc kubenswrapper[4740]: I0105 13:58:50.633823 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e82c700b-18bd-40e7-a546-67dcf6484bc9-bundle" (OuterVolumeSpecName: "bundle") pod "e82c700b-18bd-40e7-a546-67dcf6484bc9" (UID: "e82c700b-18bd-40e7-a546-67dcf6484bc9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:58:50 crc kubenswrapper[4740]: I0105 13:58:50.634026 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e82c700b-18bd-40e7-a546-67dcf6484bc9-util\") pod \"e82c700b-18bd-40e7-a546-67dcf6484bc9\" (UID: \"e82c700b-18bd-40e7-a546-67dcf6484bc9\") " Jan 05 13:58:50 crc kubenswrapper[4740]: I0105 13:58:50.641725 4740 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e82c700b-18bd-40e7-a546-67dcf6484bc9-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 13:58:50 crc kubenswrapper[4740]: I0105 13:58:50.642272 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e82c700b-18bd-40e7-a546-67dcf6484bc9-kube-api-access-8srxw" (OuterVolumeSpecName: "kube-api-access-8srxw") pod "e82c700b-18bd-40e7-a546-67dcf6484bc9" (UID: "e82c700b-18bd-40e7-a546-67dcf6484bc9"). InnerVolumeSpecName "kube-api-access-8srxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:58:50 crc kubenswrapper[4740]: I0105 13:58:50.667998 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e82c700b-18bd-40e7-a546-67dcf6484bc9-util" (OuterVolumeSpecName: "util") pod "e82c700b-18bd-40e7-a546-67dcf6484bc9" (UID: "e82c700b-18bd-40e7-a546-67dcf6484bc9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:58:50 crc kubenswrapper[4740]: I0105 13:58:50.743208 4740 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e82c700b-18bd-40e7-a546-67dcf6484bc9-util\") on node \"crc\" DevicePath \"\"" Jan 05 13:58:50 crc kubenswrapper[4740]: I0105 13:58:50.743244 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8srxw\" (UniqueName: \"kubernetes.io/projected/e82c700b-18bd-40e7-a546-67dcf6484bc9-kube-api-access-8srxw\") on node \"crc\" DevicePath \"\"" Jan 05 13:58:51 crc kubenswrapper[4740]: I0105 13:58:51.247594 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" event={"ID":"e82c700b-18bd-40e7-a546-67dcf6484bc9","Type":"ContainerDied","Data":"5df2fe099140107939f1bd8948d5e4ee07b362d429a7c0cdd08034610c6a280e"} Jan 05 13:58:51 crc kubenswrapper[4740]: I0105 13:58:51.247645 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5df2fe099140107939f1bd8948d5e4ee07b362d429a7c0cdd08034610c6a280e" Jan 05 13:58:51 crc kubenswrapper[4740]: I0105 13:58:51.247652 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4" Jan 05 13:58:58 crc kubenswrapper[4740]: I0105 13:58:58.127752 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-mzrfj"] Jan 05 13:58:58 crc kubenswrapper[4740]: E0105 13:58:58.128590 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e82c700b-18bd-40e7-a546-67dcf6484bc9" containerName="extract" Jan 05 13:58:58 crc kubenswrapper[4740]: I0105 13:58:58.128604 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82c700b-18bd-40e7-a546-67dcf6484bc9" containerName="extract" Jan 05 13:58:58 crc kubenswrapper[4740]: E0105 13:58:58.128628 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e82c700b-18bd-40e7-a546-67dcf6484bc9" containerName="pull" Jan 05 13:58:58 crc kubenswrapper[4740]: I0105 13:58:58.128634 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82c700b-18bd-40e7-a546-67dcf6484bc9" containerName="pull" Jan 05 13:58:58 crc kubenswrapper[4740]: E0105 13:58:58.128648 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e82c700b-18bd-40e7-a546-67dcf6484bc9" containerName="util" Jan 05 13:58:58 crc kubenswrapper[4740]: I0105 13:58:58.128656 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82c700b-18bd-40e7-a546-67dcf6484bc9" containerName="util" Jan 05 13:58:58 crc kubenswrapper[4740]: I0105 13:58:58.128775 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="e82c700b-18bd-40e7-a546-67dcf6484bc9" containerName="extract" Jan 05 13:58:58 crc kubenswrapper[4740]: I0105 13:58:58.129287 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-mzrfj" Jan 05 13:58:58 crc kubenswrapper[4740]: I0105 13:58:58.135563 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-7cgzb" Jan 05 13:58:58 crc kubenswrapper[4740]: I0105 13:58:58.135772 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Jan 05 13:58:58 crc kubenswrapper[4740]: I0105 13:58:58.136342 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Jan 05 13:58:58 crc kubenswrapper[4740]: I0105 13:58:58.147281 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-mzrfj"] Jan 05 13:58:58 crc kubenswrapper[4740]: I0105 13:58:58.281985 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwh7p\" (UniqueName: \"kubernetes.io/projected/018b2164-91a6-498e-bcfb-481d36067d93-kube-api-access-xwh7p\") pod \"cluster-logging-operator-79cf69ddc8-mzrfj\" (UID: \"018b2164-91a6-498e-bcfb-481d36067d93\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-mzrfj" Jan 05 13:58:58 crc kubenswrapper[4740]: I0105 13:58:58.383135 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwh7p\" (UniqueName: \"kubernetes.io/projected/018b2164-91a6-498e-bcfb-481d36067d93-kube-api-access-xwh7p\") pod \"cluster-logging-operator-79cf69ddc8-mzrfj\" (UID: \"018b2164-91a6-498e-bcfb-481d36067d93\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-mzrfj" Jan 05 13:58:58 crc kubenswrapper[4740]: I0105 13:58:58.407638 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xwh7p\" (UniqueName: \"kubernetes.io/projected/018b2164-91a6-498e-bcfb-481d36067d93-kube-api-access-xwh7p\") pod \"cluster-logging-operator-79cf69ddc8-mzrfj\" (UID: \"018b2164-91a6-498e-bcfb-481d36067d93\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-mzrfj" Jan 05 13:58:58 crc kubenswrapper[4740]: I0105 13:58:58.494620 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-mzrfj" Jan 05 13:58:58 crc kubenswrapper[4740]: I0105 13:58:58.724046 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-mzrfj"] Jan 05 13:58:59 crc kubenswrapper[4740]: I0105 13:58:59.314714 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-mzrfj" event={"ID":"018b2164-91a6-498e-bcfb-481d36067d93","Type":"ContainerStarted","Data":"9acf99b5df5748c504671116ed632621260c21247e843a126e97ebca8922af51"} Jan 05 13:59:01 crc kubenswrapper[4740]: I0105 13:59:01.915749 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 13:59:01 crc kubenswrapper[4740]: I0105 13:59:01.916363 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 13:59:05 crc kubenswrapper[4740]: I0105 13:59:05.364168 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-mzrfj" event={"ID":"018b2164-91a6-498e-bcfb-481d36067d93","Type":"ContainerStarted","Data":"e9c341dce2e9ef97ffd86a01f72fb20ab56e2e05178d92b3bd889ebead4e5373"} Jan 05 13:59:05 crc kubenswrapper[4740]: I0105 13:59:05.366613 4740 generic.go:334] "Generic (PLEG): container finished" podID="912ddca7-b1d4-4430-8b31-36519503d33e" containerID="826f93a59a6308a1dfff9bb64abf99bf527279fbbd35f0725a57d52f04d6ccff" exitCode=0 Jan 05 13:59:05 crc kubenswrapper[4740]: I0105 13:59:05.366667 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" event={"ID":"912ddca7-b1d4-4430-8b31-36519503d33e","Type":"ContainerDied","Data":"826f93a59a6308a1dfff9bb64abf99bf527279fbbd35f0725a57d52f04d6ccff"} Jan 05 13:59:05 crc kubenswrapper[4740]: I0105 13:59:05.395795 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-mzrfj" podStartSLOduration=1.636732792 podStartE2EDuration="7.39576559s" podCreationTimestamp="2026-01-05 13:58:58 +0000 UTC" firstStartedPulling="2026-01-05 13:58:58.747626702 +0000 UTC m=+588.054535321" lastFinishedPulling="2026-01-05 13:59:04.50665954 +0000 UTC m=+593.813568119" observedRunningTime="2026-01-05 13:59:05.386555443 +0000 UTC m=+594.693464052" watchObservedRunningTime="2026-01-05 13:59:05.39576559 +0000 UTC m=+594.702674199" Jan 05 13:59:06 crc kubenswrapper[4740]: I0105 13:59:06.375701 4740 generic.go:334] "Generic (PLEG): container finished" 
podID="912ddca7-b1d4-4430-8b31-36519503d33e" containerID="bb6b19ed4c227adcd84e013b17b79768afe7988b22580f64150fdb571601c20c" exitCode=0 Jan 05 13:59:06 crc kubenswrapper[4740]: I0105 13:59:06.375819 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" event={"ID":"912ddca7-b1d4-4430-8b31-36519503d33e","Type":"ContainerDied","Data":"bb6b19ed4c227adcd84e013b17b79768afe7988b22580f64150fdb571601c20c"} Jan 05 13:59:07 crc kubenswrapper[4740]: I0105 13:59:07.684572 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" Jan 05 13:59:07 crc kubenswrapper[4740]: I0105 13:59:07.773391 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8dcr\" (UniqueName: \"kubernetes.io/projected/912ddca7-b1d4-4430-8b31-36519503d33e-kube-api-access-m8dcr\") pod \"912ddca7-b1d4-4430-8b31-36519503d33e\" (UID: \"912ddca7-b1d4-4430-8b31-36519503d33e\") " Jan 05 13:59:07 crc kubenswrapper[4740]: I0105 13:59:07.773479 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/912ddca7-b1d4-4430-8b31-36519503d33e-bundle\") pod \"912ddca7-b1d4-4430-8b31-36519503d33e\" (UID: \"912ddca7-b1d4-4430-8b31-36519503d33e\") " Jan 05 13:59:07 crc kubenswrapper[4740]: I0105 13:59:07.773523 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/912ddca7-b1d4-4430-8b31-36519503d33e-util\") pod \"912ddca7-b1d4-4430-8b31-36519503d33e\" (UID: \"912ddca7-b1d4-4430-8b31-36519503d33e\") " Jan 05 13:59:07 crc kubenswrapper[4740]: I0105 13:59:07.780533 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/912ddca7-b1d4-4430-8b31-36519503d33e-bundle" (OuterVolumeSpecName: "bundle") pod "912ddca7-b1d4-4430-8b31-36519503d33e" (UID: "912ddca7-b1d4-4430-8b31-36519503d33e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:59:07 crc kubenswrapper[4740]: I0105 13:59:07.783042 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/912ddca7-b1d4-4430-8b31-36519503d33e-util" (OuterVolumeSpecName: "util") pod "912ddca7-b1d4-4430-8b31-36519503d33e" (UID: "912ddca7-b1d4-4430-8b31-36519503d33e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 13:59:07 crc kubenswrapper[4740]: I0105 13:59:07.786288 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/912ddca7-b1d4-4430-8b31-36519503d33e-kube-api-access-m8dcr" (OuterVolumeSpecName: "kube-api-access-m8dcr") pod "912ddca7-b1d4-4430-8b31-36519503d33e" (UID: "912ddca7-b1d4-4430-8b31-36519503d33e"). InnerVolumeSpecName "kube-api-access-m8dcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 13:59:07 crc kubenswrapper[4740]: I0105 13:59:07.874354 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8dcr\" (UniqueName: \"kubernetes.io/projected/912ddca7-b1d4-4430-8b31-36519503d33e-kube-api-access-m8dcr\") on node \"crc\" DevicePath \"\"" Jan 05 13:59:07 crc kubenswrapper[4740]: I0105 13:59:07.874387 4740 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/912ddca7-b1d4-4430-8b31-36519503d33e-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 13:59:07 crc kubenswrapper[4740]: I0105 13:59:07.874400 4740 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/912ddca7-b1d4-4430-8b31-36519503d33e-util\") on node \"crc\" DevicePath \"\"" Jan 05 13:59:08 crc kubenswrapper[4740]: I0105 13:59:08.393489 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" event={"ID":"912ddca7-b1d4-4430-8b31-36519503d33e","Type":"ContainerDied","Data":"02f88a68a9b8a038df5e9b19e8a28411bf37ac16ed0592f0faaebaa48cb7470b"} Jan 05 13:59:08 crc kubenswrapper[4740]: I0105 13:59:08.393554 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02f88a68a9b8a038df5e9b19e8a28411bf37ac16ed0592f0faaebaa48cb7470b" Jan 05 13:59:08 crc kubenswrapper[4740]: I0105 13:59:08.393595 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.068745 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh"] Jan 05 13:59:19 crc kubenswrapper[4740]: E0105 13:59:19.069275 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="912ddca7-b1d4-4430-8b31-36519503d33e" containerName="pull" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.069288 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="912ddca7-b1d4-4430-8b31-36519503d33e" containerName="pull" Jan 05 13:59:19 crc kubenswrapper[4740]: E0105 13:59:19.069299 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="912ddca7-b1d4-4430-8b31-36519503d33e" containerName="util" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.069305 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="912ddca7-b1d4-4430-8b31-36519503d33e" containerName="util" Jan 05 13:59:19 crc kubenswrapper[4740]: E0105 13:59:19.069322 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="912ddca7-b1d4-4430-8b31-36519503d33e" containerName="extract" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.069328 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="912ddca7-b1d4-4430-8b31-36519503d33e" containerName="extract" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.069466 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="912ddca7-b1d4-4430-8b31-36519503d33e" containerName="extract" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.070309 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.072499 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.072794 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-szqpp" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.072626 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.072857 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.072729 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.073521 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.090106 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh"] Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.161698 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/50147f9c-3a52-4e0e-b0cc-1fd94e7def10-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-56d45b676b-q44gh\" (UID: \"50147f9c-3a52-4e0e-b0cc-1fd94e7def10\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.161747 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/50147f9c-3a52-4e0e-b0cc-1fd94e7def10-manager-config\") pod \"loki-operator-controller-manager-56d45b676b-q44gh\" (UID: \"50147f9c-3a52-4e0e-b0cc-1fd94e7def10\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.161776 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg7tc\" (UniqueName: \"kubernetes.io/projected/50147f9c-3a52-4e0e-b0cc-1fd94e7def10-kube-api-access-xg7tc\") pod \"loki-operator-controller-manager-56d45b676b-q44gh\" (UID: \"50147f9c-3a52-4e0e-b0cc-1fd94e7def10\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.161796 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50147f9c-3a52-4e0e-b0cc-1fd94e7def10-apiservice-cert\") pod \"loki-operator-controller-manager-56d45b676b-q44gh\" (UID: \"50147f9c-3a52-4e0e-b0cc-1fd94e7def10\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.161853 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/50147f9c-3a52-4e0e-b0cc-1fd94e7def10-webhook-cert\") pod \"loki-operator-controller-manager-56d45b676b-q44gh\" (UID: \"50147f9c-3a52-4e0e-b0cc-1fd94e7def10\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.263462 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/50147f9c-3a52-4e0e-b0cc-1fd94e7def10-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-56d45b676b-q44gh\" (UID: \"50147f9c-3a52-4e0e-b0cc-1fd94e7def10\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.263728 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/50147f9c-3a52-4e0e-b0cc-1fd94e7def10-manager-config\") pod \"loki-operator-controller-manager-56d45b676b-q44gh\" (UID: \"50147f9c-3a52-4e0e-b0cc-1fd94e7def10\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.263810 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg7tc\" (UniqueName: \"kubernetes.io/projected/50147f9c-3a52-4e0e-b0cc-1fd94e7def10-kube-api-access-xg7tc\") pod \"loki-operator-controller-manager-56d45b676b-q44gh\" (UID: \"50147f9c-3a52-4e0e-b0cc-1fd94e7def10\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.263896 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50147f9c-3a52-4e0e-b0cc-1fd94e7def10-apiservice-cert\") pod \"loki-operator-controller-manager-56d45b676b-q44gh\" (UID: \"50147f9c-3a52-4e0e-b0cc-1fd94e7def10\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.264008 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50147f9c-3a52-4e0e-b0cc-1fd94e7def10-webhook-cert\") pod \"loki-operator-controller-manager-56d45b676b-q44gh\" (UID: \"50147f9c-3a52-4e0e-b0cc-1fd94e7def10\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.265303 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/50147f9c-3a52-4e0e-b0cc-1fd94e7def10-manager-config\") pod \"loki-operator-controller-manager-56d45b676b-q44gh\" (UID: \"50147f9c-3a52-4e0e-b0cc-1fd94e7def10\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.269815 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50147f9c-3a52-4e0e-b0cc-1fd94e7def10-apiservice-cert\") pod \"loki-operator-controller-manager-56d45b676b-q44gh\" (UID: \"50147f9c-3a52-4e0e-b0cc-1fd94e7def10\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.269862 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50147f9c-3a52-4e0e-b0cc-1fd94e7def10-webhook-cert\") pod \"loki-operator-controller-manager-56d45b676b-q44gh\" (UID: \"50147f9c-3a52-4e0e-b0cc-1fd94e7def10\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.270382 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/50147f9c-3a52-4e0e-b0cc-1fd94e7def10-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-56d45b676b-q44gh\" (UID: \"50147f9c-3a52-4e0e-b0cc-1fd94e7def10\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.287005 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg7tc\" (UniqueName: \"kubernetes.io/projected/50147f9c-3a52-4e0e-b0cc-1fd94e7def10-kube-api-access-xg7tc\") pod \"loki-operator-controller-manager-56d45b676b-q44gh\" (UID: \"50147f9c-3a52-4e0e-b0cc-1fd94e7def10\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.387470 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:19 crc kubenswrapper[4740]: I0105 13:59:19.677392 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh"] Jan 05 13:59:19 crc kubenswrapper[4740]: W0105 13:59:19.681261 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50147f9c_3a52_4e0e_b0cc_1fd94e7def10.slice/crio-a056a47100111c43318cbb743468cd86f2f976065e8baad8588d1c19a1c2d6be WatchSource:0}: Error finding container a056a47100111c43318cbb743468cd86f2f976065e8baad8588d1c19a1c2d6be: Status 404 returned error can't find the container with id a056a47100111c43318cbb743468cd86f2f976065e8baad8588d1c19a1c2d6be Jan 05 13:59:20 crc kubenswrapper[4740]: I0105 13:59:20.501661 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" event={"ID":"50147f9c-3a52-4e0e-b0cc-1fd94e7def10","Type":"ContainerStarted","Data":"a056a47100111c43318cbb743468cd86f2f976065e8baad8588d1c19a1c2d6be"} Jan 05 13:59:25 crc kubenswrapper[4740]: I0105 13:59:25.547271 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" event={"ID":"50147f9c-3a52-4e0e-b0cc-1fd94e7def10","Type":"ContainerStarted","Data":"ede59bfc0d35a45fc093143ecbb937450674fb8acdfe859a924be5f23761f648"} Jan 05 13:59:31 crc kubenswrapper[4740]: I0105 13:59:31.589314 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" event={"ID":"50147f9c-3a52-4e0e-b0cc-1fd94e7def10","Type":"ContainerStarted","Data":"c130846727208c181c237599b26daa4eb794c806013a474293ec047fa26c0994"} Jan 05 13:59:31 crc kubenswrapper[4740]: I0105 13:59:31.590130 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:31 crc kubenswrapper[4740]: I0105 13:59:31.592206 4740 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 13:59:31 crc kubenswrapper[4740]: I0105 13:59:31.621454 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" podStartSLOduration=1.144890885 podStartE2EDuration="12.621437521s" podCreationTimestamp="2026-01-05 13:59:19 +0000 UTC" firstStartedPulling="2026-01-05 13:59:19.683990238 +0000 UTC m=+608.990898817" lastFinishedPulling="2026-01-05 13:59:31.160536864 +0000 UTC m=+620.467445453" observedRunningTime="2026-01-05 13:59:31.615632595 +0000 UTC m=+620.922541194" watchObservedRunningTime="2026-01-05 13:59:31.621437521 +0000 UTC m=+620.928346100" Jan 05 13:59:31 crc kubenswrapper[4740]: I0105 13:59:31.915598 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 13:59:31 crc kubenswrapper[4740]: I0105 13:59:31.915665 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 13:59:36 crc kubenswrapper[4740]: I0105 13:59:36.833221 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Jan 05 13:59:36 crc kubenswrapper[4740]: I0105 13:59:36.835163 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Jan 05 13:59:36 crc kubenswrapper[4740]: I0105 13:59:36.837673 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Jan 05 13:59:36 crc kubenswrapper[4740]: I0105 13:59:36.840208 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Jan 05 13:59:36 crc kubenswrapper[4740]: I0105 13:59:36.843974 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Jan 05 13:59:36 crc kubenswrapper[4740]: I0105 13:59:36.957184 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e1f7b8cc-c4a5-4765-9ab0-22a4a01e03f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1f7b8cc-c4a5-4765-9ab0-22a4a01e03f2\") pod \"minio\" (UID: \"a9586400-82ed-4186-84c4-766b9dea8870\") " pod="minio-dev/minio" Jan 05 13:59:36 crc kubenswrapper[4740]: I0105 13:59:36.957225 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk7p7\" (UniqueName: \"kubernetes.io/projected/a9586400-82ed-4186-84c4-766b9dea8870-kube-api-access-mk7p7\") pod \"minio\" (UID: \"a9586400-82ed-4186-84c4-766b9dea8870\") " pod="minio-dev/minio" Jan 05 13:59:37 crc kubenswrapper[4740]: I0105 13:59:37.058709 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e1f7b8cc-c4a5-4765-9ab0-22a4a01e03f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1f7b8cc-c4a5-4765-9ab0-22a4a01e03f2\") pod \"minio\" (UID: \"a9586400-82ed-4186-84c4-766b9dea8870\") " pod="minio-dev/minio" Jan 05 13:59:37 crc kubenswrapper[4740]: I0105 13:59:37.058752 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk7p7\" (UniqueName: \"kubernetes.io/projected/a9586400-82ed-4186-84c4-766b9dea8870-kube-api-access-mk7p7\") pod \"minio\" (UID: \"a9586400-82ed-4186-84c4-766b9dea8870\") " pod="minio-dev/minio" Jan 05 13:59:37 crc kubenswrapper[4740]: I0105 13:59:37.061960 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 05 13:59:37 crc kubenswrapper[4740]: I0105 13:59:37.062163 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e1f7b8cc-c4a5-4765-9ab0-22a4a01e03f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1f7b8cc-c4a5-4765-9ab0-22a4a01e03f2\") pod \"minio\" (UID: \"a9586400-82ed-4186-84c4-766b9dea8870\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/98659a5e563eb107ce21da2387ab5690dd0f55c464a9c5072c46597a2aa07e31/globalmount\"" pod="minio-dev/minio" Jan 05 13:59:37 crc kubenswrapper[4740]: I0105 13:59:37.080676 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk7p7\" (UniqueName: \"kubernetes.io/projected/a9586400-82ed-4186-84c4-766b9dea8870-kube-api-access-mk7p7\") pod \"minio\" (UID: \"a9586400-82ed-4186-84c4-766b9dea8870\") " pod="minio-dev/minio" Jan 05 13:59:37 crc kubenswrapper[4740]: I0105 13:59:37.102006 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e1f7b8cc-c4a5-4765-9ab0-22a4a01e03f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1f7b8cc-c4a5-4765-9ab0-22a4a01e03f2\") pod \"minio\" (UID: \"a9586400-82ed-4186-84c4-766b9dea8870\") " pod="minio-dev/minio" Jan 05 13:59:37 crc kubenswrapper[4740]: I0105 13:59:37.204781 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Jan 05 13:59:37 crc kubenswrapper[4740]: I0105 13:59:37.676740 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Jan 05 13:59:38 crc kubenswrapper[4740]: I0105 13:59:38.654334 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"a9586400-82ed-4186-84c4-766b9dea8870","Type":"ContainerStarted","Data":"2e8b1e3a23dfd7cec8bbc55e9d6bc24e31f2f11aba448744612cc545129e3419"} Jan 05 13:59:41 crc kubenswrapper[4740]: I0105 13:59:41.691326 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"a9586400-82ed-4186-84c4-766b9dea8870","Type":"ContainerStarted","Data":"69465445a188090c2cba91ecc837fb68d2380f075a1811fc878174df8dbc67b1"} Jan 05 13:59:41 crc kubenswrapper[4740]: I0105 13:59:41.719351 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=5.659940378 podStartE2EDuration="8.719324482s" podCreationTimestamp="2026-01-05 13:59:33 +0000 UTC" firstStartedPulling="2026-01-05 13:59:37.711424728 +0000 UTC m=+627.018333347" lastFinishedPulling="2026-01-05 13:59:40.770808852 +0000 UTC m=+630.077717451" observedRunningTime="2026-01-05 13:59:41.710875796 +0000 UTC m=+631.017784405" watchObservedRunningTime="2026-01-05 13:59:41.719324482 +0000 UTC m=+631.026233091" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.579672 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh"] Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.588804 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.593576 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.593670 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-fzcpq" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.593747 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.594235 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.596242 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.596579 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh"] Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.643626 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7927742d-54ad-4fbb-841a-71d40648d88e-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-bdcsh\" (UID: \"7927742d-54ad-4fbb-841a-71d40648d88e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.643718 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/7927742d-54ad-4fbb-841a-71d40648d88e-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-bdcsh\" (UID: \"7927742d-54ad-4fbb-841a-71d40648d88e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.643865 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/7927742d-54ad-4fbb-841a-71d40648d88e-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5f678c8dd6-bdcsh\" (UID: \"7927742d-54ad-4fbb-841a-71d40648d88e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.644016 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7927742d-54ad-4fbb-841a-71d40648d88e-config\") pod \"logging-loki-distributor-5f678c8dd6-bdcsh\" (UID: \"7927742d-54ad-4fbb-841a-71d40648d88e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.644128 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbjwn\" (UniqueName: \"kubernetes.io/projected/7927742d-54ad-4fbb-841a-71d40648d88e-kube-api-access-hbjwn\") pod \"logging-loki-distributor-5f678c8dd6-bdcsh\" (UID: \"7927742d-54ad-4fbb-841a-71d40648d88e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.739906 4740 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-logging/logging-loki-querier-76788598db-n78fw"] Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.740724 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.745308 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.745520 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.745804 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.747375 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbjwn\" (UniqueName: \"kubernetes.io/projected/7927742d-54ad-4fbb-841a-71d40648d88e-kube-api-access-hbjwn\") pod \"logging-loki-distributor-5f678c8dd6-bdcsh\" (UID: \"7927742d-54ad-4fbb-841a-71d40648d88e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.747425 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7927742d-54ad-4fbb-841a-71d40648d88e-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-bdcsh\" (UID: \"7927742d-54ad-4fbb-841a-71d40648d88e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.747488 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/7927742d-54ad-4fbb-841a-71d40648d88e-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-bdcsh\" (UID: \"7927742d-54ad-4fbb-841a-71d40648d88e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.747504 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/7927742d-54ad-4fbb-841a-71d40648d88e-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5f678c8dd6-bdcsh\" (UID: \"7927742d-54ad-4fbb-841a-71d40648d88e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.747534 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7927742d-54ad-4fbb-841a-71d40648d88e-config\") pod \"logging-loki-distributor-5f678c8dd6-bdcsh\" (UID: \"7927742d-54ad-4fbb-841a-71d40648d88e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.747994 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76788598db-n78fw"] Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.748386 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7927742d-54ad-4fbb-841a-71d40648d88e-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-bdcsh\" (UID: \"7927742d-54ad-4fbb-841a-71d40648d88e\") " 
pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.748418 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7927742d-54ad-4fbb-841a-71d40648d88e-config\") pod \"logging-loki-distributor-5f678c8dd6-bdcsh\" (UID: \"7927742d-54ad-4fbb-841a-71d40648d88e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.753691 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/7927742d-54ad-4fbb-841a-71d40648d88e-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-bdcsh\" (UID: \"7927742d-54ad-4fbb-841a-71d40648d88e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.753747 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/7927742d-54ad-4fbb-841a-71d40648d88e-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5f678c8dd6-bdcsh\" (UID: \"7927742d-54ad-4fbb-841a-71d40648d88e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.763529 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbjwn\" (UniqueName: \"kubernetes.io/projected/7927742d-54ad-4fbb-841a-71d40648d88e-kube-api-access-hbjwn\") pod \"logging-loki-distributor-5f678c8dd6-bdcsh\" (UID: \"7927742d-54ad-4fbb-841a-71d40648d88e\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.817178 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc"] Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.817998 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.821747 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.826682 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.840782 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc"] Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.849480 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptz6w\" (UniqueName: \"kubernetes.io/projected/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-kube-api-access-ptz6w\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.849558 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.849599 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-config\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.849619 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.849649 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-logging-loki-querier-http\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.849674 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-logging-loki-s3\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.897103 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw"] Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.898199 4740 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.905645 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.905736 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.905646 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.905927 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.907920 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.912559 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p"] Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.913968 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.912784 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.916261 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-hvcnj" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.927247 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw"] Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.942795 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p"] Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950500 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8fd9857-fca2-4041-9c72-3747c84b6987-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-lbmjc\" (UID: \"d8fd9857-fca2-4041-9c72-3747c84b6987\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950536 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd41c01f-dff3-4b6a-ae38-8b114b384a59-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950561 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/fd41c01f-dff3-4b6a-ae38-8b114b384a59-rbac\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:46 crc kubenswrapper[4740]: 
I0105 13:59:46.950588 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-config\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950609 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950631 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjh47\" (UniqueName: \"kubernetes.io/projected/fd41c01f-dff3-4b6a-ae38-8b114b384a59-kube-api-access-qjh47\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950649 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd41c01f-dff3-4b6a-ae38-8b114b384a59-logging-loki-ca-bundle\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950670 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-logging-loki-querier-http\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950690 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/fd41c01f-dff3-4b6a-ae38-8b114b384a59-lokistack-gateway\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950713 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-logging-loki-s3\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950729 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/d8fd9857-fca2-4041-9c72-3747c84b6987-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-lbmjc\" (UID: \"d8fd9857-fca2-4041-9c72-3747c84b6987\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950756 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/fd41c01f-dff3-4b6a-ae38-8b114b384a59-tenants\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950782 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptz6w\" (UniqueName: \"kubernetes.io/projected/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-kube-api-access-ptz6w\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950802 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fd41c01f-dff3-4b6a-ae38-8b114b384a59-tls-secret\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950819 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/d8fd9857-fca2-4041-9c72-3747c84b6987-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-69d9546745-lbmjc\" (UID: \"d8fd9857-fca2-4041-9c72-3747c84b6987\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950845 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/fd41c01f-dff3-4b6a-ae38-8b114b384a59-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950872 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrkjm\" (UniqueName: \"kubernetes.io/projected/d8fd9857-fca2-4041-9c72-3747c84b6987-kube-api-access-nrkjm\") pod \"logging-loki-query-frontend-69d9546745-lbmjc\" (UID: \"d8fd9857-fca2-4041-9c72-3747c84b6987\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950900 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8fd9857-fca2-4041-9c72-3747c84b6987-config\") pod \"logging-loki-query-frontend-69d9546745-lbmjc\" (UID: \"d8fd9857-fca2-4041-9c72-3747c84b6987\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.950928 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.952037 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.952667 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-config\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.956342 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.966416 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-logging-loki-s3\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:46 crc kubenswrapper[4740]: I0105 13:59:46.971487 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-logging-loki-querier-http\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.008722 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptz6w\" (UniqueName: \"kubernetes.io/projected/5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41-kube-api-access-ptz6w\") pod \"logging-loki-querier-76788598db-n78fw\" (UID: \"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41\") " pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.051758 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrkjm\" (UniqueName: \"kubernetes.io/projected/d8fd9857-fca2-4041-9c72-3747c84b6987-kube-api-access-nrkjm\") pod \"logging-loki-query-frontend-69d9546745-lbmjc\" (UID: \"d8fd9857-fca2-4041-9c72-3747c84b6987\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.051890 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/1e34aa71-7c05-4606-a71a-2c5b20667ba1-lokistack-gateway\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.051970 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: 
\"kubernetes.io/secret/1e34aa71-7c05-4606-a71a-2c5b20667ba1-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.052053 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8fd9857-fca2-4041-9c72-3747c84b6987-config\") pod \"logging-loki-query-frontend-69d9546745-lbmjc\" (UID: \"d8fd9857-fca2-4041-9c72-3747c84b6987\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.052261 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/1e34aa71-7c05-4606-a71a-2c5b20667ba1-rbac\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.052352 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/1e34aa71-7c05-4606-a71a-2c5b20667ba1-tenants\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.052424 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e34aa71-7c05-4606-a71a-2c5b20667ba1-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.052493 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8fd9857-fca2-4041-9c72-3747c84b6987-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-lbmjc\" (UID: \"d8fd9857-fca2-4041-9c72-3747c84b6987\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.052556 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd41c01f-dff3-4b6a-ae38-8b114b384a59-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.052653 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/fd41c01f-dff3-4b6a-ae38-8b114b384a59-rbac\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.052724 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e34aa71-7c05-4606-a71a-2c5b20667ba1-logging-loki-ca-bundle\") pod 
\"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.052808 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjh47\" (UniqueName: \"kubernetes.io/projected/fd41c01f-dff3-4b6a-ae38-8b114b384a59-kube-api-access-qjh47\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.052882 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd41c01f-dff3-4b6a-ae38-8b114b384a59-logging-loki-ca-bundle\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.052955 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/fd41c01f-dff3-4b6a-ae38-8b114b384a59-lokistack-gateway\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.053025 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5msq7\" (UniqueName: \"kubernetes.io/projected/1e34aa71-7c05-4606-a71a-2c5b20667ba1-kube-api-access-5msq7\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.053133 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/d8fd9857-fca2-4041-9c72-3747c84b6987-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-lbmjc\" (UID: \"d8fd9857-fca2-4041-9c72-3747c84b6987\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.053221 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/fd41c01f-dff3-4b6a-ae38-8b114b384a59-tenants\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.053612 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/1e34aa71-7c05-4606-a71a-2c5b20667ba1-tls-secret\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.053713 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fd41c01f-dff3-4b6a-ae38-8b114b384a59-tls-secret\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " 
pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.054453 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/d8fd9857-fca2-4041-9c72-3747c84b6987-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-69d9546745-lbmjc\" (UID: \"d8fd9857-fca2-4041-9c72-3747c84b6987\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.054538 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/fd41c01f-dff3-4b6a-ae38-8b114b384a59-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.054802 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd41c01f-dff3-4b6a-ae38-8b114b384a59-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.055731 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8fd9857-fca2-4041-9c72-3747c84b6987-config\") pod \"logging-loki-query-frontend-69d9546745-lbmjc\" (UID: \"d8fd9857-fca2-4041-9c72-3747c84b6987\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.056318 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8fd9857-fca2-4041-9c72-3747c84b6987-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-lbmjc\" (UID: \"d8fd9857-fca2-4041-9c72-3747c84b6987\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.057142 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/fd41c01f-dff3-4b6a-ae38-8b114b384a59-rbac\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.057943 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd41c01f-dff3-4b6a-ae38-8b114b384a59-logging-loki-ca-bundle\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.060299 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/fd41c01f-dff3-4b6a-ae38-8b114b384a59-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: E0105 
13:59:47.063218 4740 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Jan 05 13:59:47 crc kubenswrapper[4740]: E0105 13:59:47.063299 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd41c01f-dff3-4b6a-ae38-8b114b384a59-tls-secret podName:fd41c01f-dff3-4b6a-ae38-8b114b384a59 nodeName:}" failed. No retries permitted until 2026-01-05 13:59:47.563281109 +0000 UTC m=+636.870189678 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/fd41c01f-dff3-4b6a-ae38-8b114b384a59-tls-secret") pod "logging-loki-gateway-66cd7bf4cd-8svgw" (UID: "fd41c01f-dff3-4b6a-ae38-8b114b384a59") : secret "logging-loki-gateway-http" not found Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.065687 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/d8fd9857-fca2-4041-9c72-3747c84b6987-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-lbmjc\" (UID: \"d8fd9857-fca2-4041-9c72-3747c84b6987\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.070131 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/fd41c01f-dff3-4b6a-ae38-8b114b384a59-lokistack-gateway\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.071406 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/fd41c01f-dff3-4b6a-ae38-8b114b384a59-tenants\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.078559 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/d8fd9857-fca2-4041-9c72-3747c84b6987-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-69d9546745-lbmjc\" (UID: \"d8fd9857-fca2-4041-9c72-3747c84b6987\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.088661 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrkjm\" (UniqueName: \"kubernetes.io/projected/d8fd9857-fca2-4041-9c72-3747c84b6987-kube-api-access-nrkjm\") pod \"logging-loki-query-frontend-69d9546745-lbmjc\" (UID: \"d8fd9857-fca2-4041-9c72-3747c84b6987\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.092589 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjh47\" (UniqueName: \"kubernetes.io/projected/fd41c01f-dff3-4b6a-ae38-8b114b384a59-kube-api-access-qjh47\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.097406 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.133349 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.159039 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/1e34aa71-7c05-4606-a71a-2c5b20667ba1-lokistack-gateway\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.159095 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/1e34aa71-7c05-4606-a71a-2c5b20667ba1-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.159129 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/1e34aa71-7c05-4606-a71a-2c5b20667ba1-rbac\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.159148 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/1e34aa71-7c05-4606-a71a-2c5b20667ba1-tenants\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.159166 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e34aa71-7c05-4606-a71a-2c5b20667ba1-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.159191 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e34aa71-7c05-4606-a71a-2c5b20667ba1-logging-loki-ca-bundle\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.159233 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5msq7\" (UniqueName: \"kubernetes.io/projected/1e34aa71-7c05-4606-a71a-2c5b20667ba1-kube-api-access-5msq7\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.159266 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/1e34aa71-7c05-4606-a71a-2c5b20667ba1-tls-secret\") pod 
\"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: E0105 13:59:47.159384 4740 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Jan 05 13:59:47 crc kubenswrapper[4740]: E0105 13:59:47.159433 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e34aa71-7c05-4606-a71a-2c5b20667ba1-tls-secret podName:1e34aa71-7c05-4606-a71a-2c5b20667ba1 nodeName:}" failed. No retries permitted until 2026-01-05 13:59:47.659419043 +0000 UTC m=+636.966327622 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/1e34aa71-7c05-4606-a71a-2c5b20667ba1-tls-secret") pod "logging-loki-gateway-66cd7bf4cd-8vc2p" (UID: "1e34aa71-7c05-4606-a71a-2c5b20667ba1") : secret "logging-loki-gateway-http" not found Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.159928 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/1e34aa71-7c05-4606-a71a-2c5b20667ba1-lokistack-gateway\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.160324 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/1e34aa71-7c05-4606-a71a-2c5b20667ba1-rbac\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.164122 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/1e34aa71-7c05-4606-a71a-2c5b20667ba1-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.164651 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e34aa71-7c05-4606-a71a-2c5b20667ba1-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.169221 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/1e34aa71-7c05-4606-a71a-2c5b20667ba1-tenants\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.169573 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e34aa71-7c05-4606-a71a-2c5b20667ba1-logging-loki-ca-bundle\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.180628 
4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5msq7\" (UniqueName: \"kubernetes.io/projected/1e34aa71-7c05-4606-a71a-2c5b20667ba1-kube-api-access-5msq7\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.446970 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh"] Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.535528 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76788598db-n78fw"] Jan 05 13:59:47 crc kubenswrapper[4740]: W0105 13:59:47.539613 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c35e5ba_6f22_4fbd_b3d9_e7ebe1980d41.slice/crio-a7c2631a4366d0603bd4f2c94c43fb21986fa512208487dafdb6320197ccb33c WatchSource:0}: Error finding container a7c2631a4366d0603bd4f2c94c43fb21986fa512208487dafdb6320197ccb33c: Status 404 returned error can't find the container with id a7c2631a4366d0603bd4f2c94c43fb21986fa512208487dafdb6320197ccb33c Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.567961 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fd41c01f-dff3-4b6a-ae38-8b114b384a59-tls-secret\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.571499 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fd41c01f-dff3-4b6a-ae38-8b114b384a59-tls-secret\") pod \"logging-loki-gateway-66cd7bf4cd-8svgw\" (UID: \"fd41c01f-dff3-4b6a-ae38-8b114b384a59\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.593187 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.669598 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/1e34aa71-7c05-4606-a71a-2c5b20667ba1-tls-secret\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.673829 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/1e34aa71-7c05-4606-a71a-2c5b20667ba1-tls-secret\") pod \"logging-loki-gateway-66cd7bf4cd-8vc2p\" (UID: \"1e34aa71-7c05-4606-a71a-2c5b20667ba1\") " pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.740741 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.741685 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.744654 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76788598db-n78fw" event={"ID":"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41","Type":"ContainerStarted","Data":"a7c2631a4366d0603bd4f2c94c43fb21986fa512208487dafdb6320197ccb33c"} Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.745512 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.746792 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.763453 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" event={"ID":"7927742d-54ad-4fbb-841a-71d40648d88e","Type":"ContainerStarted","Data":"e07b59e2985b5b2ea711c7479ae2287da9ad82fc4bedee78f69ab342047e7e04"} Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.770691 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.781544 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc"] Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.810337 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.812449 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.814581 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.814934 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.819999 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.874617 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-58501ad4-9747-4134-9b58-7da29b69fd5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-58501ad4-9747-4134-9b58-7da29b69fd5d\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.874722 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.874819 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: 
\"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.874888 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.875015 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3575dd88-8679-404a-b9cc-3c5284583b9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3575dd88-8679-404a-b9cc-3c5284583b9a\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.875170 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.875236 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-config\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.875309 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4457b64d-e326-4f6d-b369-671bb6564cb7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4457b64d-e326-4f6d-b369-671bb6564cb7\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.875345 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-config\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.875419 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.875498 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.875864 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np6qz\" (UniqueName: \"kubernetes.io/projected/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-kube-api-access-np6qz\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.876008 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.876090 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85m5k\" (UniqueName: \"kubernetes.io/projected/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-kube-api-access-85m5k\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.876142 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.890483 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.891488 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.898372 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.898390 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.898626 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.938578 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.977632 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-58501ad4-9747-4134-9b58-7da29b69fd5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-58501ad4-9747-4134-9b58-7da29b69fd5d\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.977690 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.977726 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.977750 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.977779 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/572c5991-5ab4-4786-9641-cc8f3ff4bd21-config\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.977822 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/572c5991-5ab4-4786-9641-cc8f3ff4bd21-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.977847 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/572c5991-5ab4-4786-9641-cc8f3ff4bd21-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.977877 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3575dd88-8679-404a-b9cc-3c5284583b9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3575dd88-8679-404a-b9cc-3c5284583b9a\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.977905 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.977927 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-config\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.977952 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4457b64d-e326-4f6d-b369-671bb6564cb7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4457b64d-e326-4f6d-b369-671bb6564cb7\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.977973 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-config\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.978103 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.978164 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57ggc\" (UniqueName: \"kubernetes.io/projected/572c5991-5ab4-4786-9641-cc8f3ff4bd21-kube-api-access-57ggc\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.978184 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.978215 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np6qz\" (UniqueName: \"kubernetes.io/projected/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-kube-api-access-np6qz\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.978245 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/572c5991-5ab4-4786-9641-cc8f3ff4bd21-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " 
pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.978293 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/572c5991-5ab4-4786-9641-cc8f3ff4bd21-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.978314 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.978336 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-93b7ab57-fd1e-4668-96a4-3077895fe3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93b7ab57-fd1e-4668-96a4-3077895fe3b4\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.978379 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85m5k\" (UniqueName: \"kubernetes.io/projected/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-kube-api-access-85m5k\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.978419 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.980436 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-config\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.981984 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.982198 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.983265 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.983294 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4457b64d-e326-4f6d-b369-671bb6564cb7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4457b64d-e326-4f6d-b369-671bb6564cb7\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8080d03bd1cd3b8e7960b699a9d6136a44e9bc84e1ac0dbb272776644477e304/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.983796 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-config\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.984177 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.984334 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.984543 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.984741 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.984778 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3575dd88-8679-404a-b9cc-3c5284583b9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3575dd88-8679-404a-b9cc-3c5284583b9a\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/03de8a8528844a827e8141cba0da263e8366ad7a04d5084cfa9747e9d6ed961a/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.985249 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.988291 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.990258 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.993387 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.993433 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-58501ad4-9747-4134-9b58-7da29b69fd5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-58501ad4-9747-4134-9b58-7da29b69fd5d\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a0a0f399a63cf22b44c7b498c1bc4f772255053b5f0d6b37a6a73f38d41c54f3/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:47 crc kubenswrapper[4740]: I0105 13:59:47.998934 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85m5k\" (UniqueName: \"kubernetes.io/projected/8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c-kube-api-access-85m5k\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.007341 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np6qz\" (UniqueName: \"kubernetes.io/projected/7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb-kube-api-access-np6qz\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.017825 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3575dd88-8679-404a-b9cc-3c5284583b9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3575dd88-8679-404a-b9cc-3c5284583b9a\") pod \"logging-loki-compactor-0\" (UID: \"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c\") " pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.030638 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-58501ad4-9747-4134-9b58-7da29b69fd5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-58501ad4-9747-4134-9b58-7da29b69fd5d\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.032641 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4457b64d-e326-4f6d-b369-671bb6564cb7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4457b64d-e326-4f6d-b369-671bb6564cb7\") pod \"logging-loki-ingester-0\" (UID: \"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb\") " pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.079037 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw"] Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.079933 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/572c5991-5ab4-4786-9641-cc8f3ff4bd21-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.079973 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/572c5991-5ab4-4786-9641-cc8f3ff4bd21-logging-loki-index-gateway-grpc\") pod 
\"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.080020 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57ggc\" (UniqueName: \"kubernetes.io/projected/572c5991-5ab4-4786-9641-cc8f3ff4bd21-kube-api-access-57ggc\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.080050 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/572c5991-5ab4-4786-9641-cc8f3ff4bd21-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.080093 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/572c5991-5ab4-4786-9641-cc8f3ff4bd21-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.080115 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-93b7ab57-fd1e-4668-96a4-3077895fe3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93b7ab57-fd1e-4668-96a4-3077895fe3b4\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.080177 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/572c5991-5ab4-4786-9641-cc8f3ff4bd21-config\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.081486 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/572c5991-5ab4-4786-9641-cc8f3ff4bd21-config\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.081601 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/572c5991-5ab4-4786-9641-cc8f3ff4bd21-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.084893 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/572c5991-5ab4-4786-9641-cc8f3ff4bd21-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.085405 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" 
(UniqueName: \"kubernetes.io/secret/572c5991-5ab4-4786-9641-cc8f3ff4bd21-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.087598 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/572c5991-5ab4-4786-9641-cc8f3ff4bd21-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:48 crc kubenswrapper[4740]: W0105 13:59:48.093431 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd41c01f_dff3_4b6a_ae38_8b114b384a59.slice/crio-e07f2378332e8c40e4c2e2cb5f164ae7db524632cc7d85590927d602636e4466 WatchSource:0}: Error finding container e07f2378332e8c40e4c2e2cb5f164ae7db524632cc7d85590927d602636e4466: Status 404 returned error can't find the container with id e07f2378332e8c40e4c2e2cb5f164ae7db524632cc7d85590927d602636e4466 Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.093853 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.093893 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-93b7ab57-fd1e-4668-96a4-3077895fe3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93b7ab57-fd1e-4668-96a4-3077895fe3b4\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1c275a296e16001212a8b22316b8ced3fc2c5c81772a71ddbecfca1b91ed05f7/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.104770 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57ggc\" (UniqueName: \"kubernetes.io/projected/572c5991-5ab4-4786-9641-cc8f3ff4bd21-kube-api-access-57ggc\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.120605 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-93b7ab57-fd1e-4668-96a4-3077895fe3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93b7ab57-fd1e-4668-96a4-3077895fe3b4\") pod \"logging-loki-index-gateway-0\" (UID: \"572c5991-5ab4-4786-9641-cc8f3ff4bd21\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.149558 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.173935 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.204640 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.360305 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p"] Jan 05 13:59:48 crc kubenswrapper[4740]: W0105 13:59:48.368782 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e34aa71_7c05_4606_a71a_2c5b20667ba1.slice/crio-e16a25986ae66b7950addbd82d0d8087d31a7c6b8eeae57ae24a5d6a3b9adbd0 WatchSource:0}: Error finding container e16a25986ae66b7950addbd82d0d8087d31a7c6b8eeae57ae24a5d6a3b9adbd0: Status 404 returned error can't find the container with id e16a25986ae66b7950addbd82d0d8087d31a7c6b8eeae57ae24a5d6a3b9adbd0 Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.474792 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.619944 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Jan 05 13:59:48 crc kubenswrapper[4740]: W0105 13:59:48.734053 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod572c5991_5ab4_4786_9641_cc8f3ff4bd21.slice/crio-48628ba364790df3a55dd94f5144a24dfc03387e981401b10f8ef300a539fda7 WatchSource:0}: Error finding container 48628ba364790df3a55dd94f5144a24dfc03387e981401b10f8ef300a539fda7: Status 404 returned error can't find the container with id 48628ba364790df3a55dd94f5144a24dfc03387e981401b10f8ef300a539fda7 Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.736506 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.776867 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"572c5991-5ab4-4786-9641-cc8f3ff4bd21","Type":"ContainerStarted","Data":"48628ba364790df3a55dd94f5144a24dfc03387e981401b10f8ef300a539fda7"} Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.779653 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb","Type":"ContainerStarted","Data":"ee1d8f40e7d39ee379eaa1eea202ea537085c79185cbc142c2fd29d833b52fc4"} Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.781763 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c","Type":"ContainerStarted","Data":"89ae29f9cd795ab04ce77cc7d166155710db867cb79bd26387c06c7de8a31a89"} Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.784295 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" event={"ID":"1e34aa71-7c05-4606-a71a-2c5b20667ba1","Type":"ContainerStarted","Data":"e16a25986ae66b7950addbd82d0d8087d31a7c6b8eeae57ae24a5d6a3b9adbd0"} Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.787914 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" event={"ID":"fd41c01f-dff3-4b6a-ae38-8b114b384a59","Type":"ContainerStarted","Data":"e07f2378332e8c40e4c2e2cb5f164ae7db524632cc7d85590927d602636e4466"} Jan 05 13:59:48 crc kubenswrapper[4740]: I0105 13:59:48.789801 4740 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" event={"ID":"d8fd9857-fca2-4041-9c72-3747c84b6987","Type":"ContainerStarted","Data":"61e33312c4c4948e2d30d92ae29025d06880eb28afc98e5c30a2a3b380dc5fc4"} Jan 05 13:59:52 crc kubenswrapper[4740]: I0105 13:59:52.843989 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" event={"ID":"fd41c01f-dff3-4b6a-ae38-8b114b384a59","Type":"ContainerStarted","Data":"29501ab290ba1eeba5d60c0696b9587630c679340f94b8cd0b7600821b4b8f58"} Jan 05 13:59:52 crc kubenswrapper[4740]: I0105 13:59:52.846297 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" event={"ID":"d8fd9857-fca2-4041-9c72-3747c84b6987","Type":"ContainerStarted","Data":"e729fc4b06ca22f021ddcc891fdda68500b67824372beb2c44e4c938dfa89d48"} Jan 05 13:59:52 crc kubenswrapper[4740]: I0105 13:59:52.846493 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 13:59:52 crc kubenswrapper[4740]: I0105 13:59:52.849365 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" event={"ID":"7927742d-54ad-4fbb-841a-71d40648d88e","Type":"ContainerStarted","Data":"2ca2945d300f69ee4eab10bb9909b6d8fca3c5e944529bcfad678e5de505d769"} Jan 05 13:59:52 crc kubenswrapper[4740]: I0105 13:59:52.871823 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 13:59:52 crc kubenswrapper[4740]: I0105 13:59:52.872140 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76788598db-n78fw" event={"ID":"5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41","Type":"ContainerStarted","Data":"cd7b3ea065873b5d61c104152511e68024a3b8bb5df0c7c32f7cb23560f5a903"} Jan 05 13:59:52 crc kubenswrapper[4740]: I0105 13:59:52.872233 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"572c5991-5ab4-4786-9641-cc8f3ff4bd21","Type":"ContainerStarted","Data":"d436c01cb15b88c30d9afe1c08df850627c2a8a47e370d9ed04a8ed992e941cd"} Jan 05 13:59:52 crc kubenswrapper[4740]: I0105 13:59:52.872326 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 13:59:52 crc kubenswrapper[4740]: I0105 13:59:52.872391 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 13:59:52 crc kubenswrapper[4740]: I0105 13:59:52.874177 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb","Type":"ContainerStarted","Data":"7e0071e5b85ee5ee7b14a3099aa98686f8438b92150da84c306f23fc24d3397a"} Jan 05 13:59:52 crc kubenswrapper[4740]: I0105 13:59:52.874375 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Jan 05 13:59:52 crc kubenswrapper[4740]: I0105 13:59:52.878262 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c","Type":"ContainerStarted","Data":"a0ab7c6c8b3e9dd2c253ce6c9e01336709b833b90252e36981c2e9ec3c92f007"} Jan 05 13:59:52 crc 
kubenswrapper[4740]: I0105 13:59:52.878362 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Jan 05 13:59:52 crc kubenswrapper[4740]: I0105 13:59:52.880492 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" event={"ID":"1e34aa71-7c05-4606-a71a-2c5b20667ba1","Type":"ContainerStarted","Data":"c6945769de697030dc4c952209533fd75ee2acbdab9e9cc2c8cecb6e99b17680"} Jan 05 13:59:52 crc kubenswrapper[4740]: I0105 13:59:52.888283 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" podStartSLOduration=2.968629542 podStartE2EDuration="6.888265293s" podCreationTimestamp="2026-01-05 13:59:46 +0000 UTC" firstStartedPulling="2026-01-05 13:59:47.814896789 +0000 UTC m=+637.121805368" lastFinishedPulling="2026-01-05 13:59:51.7345325 +0000 UTC m=+641.041441119" observedRunningTime="2026-01-05 13:59:52.88666881 +0000 UTC m=+642.193577419" watchObservedRunningTime="2026-01-05 13:59:52.888265293 +0000 UTC m=+642.195173882" Jan 05 13:59:52 crc kubenswrapper[4740]: I0105 13:59:52.938674 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.857447084 podStartE2EDuration="6.938648472s" podCreationTimestamp="2026-01-05 13:59:46 +0000 UTC" firstStartedPulling="2026-01-05 13:59:48.624015056 +0000 UTC m=+637.930923655" lastFinishedPulling="2026-01-05 13:59:51.705216464 +0000 UTC m=+641.012125043" observedRunningTime="2026-01-05 13:59:52.924710689 +0000 UTC m=+642.231619328" watchObservedRunningTime="2026-01-05 13:59:52.938648472 +0000 UTC m=+642.245557081" Jan 05 13:59:52 crc kubenswrapper[4740]: I0105 13:59:52.957773 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.97537677 podStartE2EDuration="6.957744472s" podCreationTimestamp="2026-01-05 13:59:46 +0000 UTC" firstStartedPulling="2026-01-05 13:59:48.736637412 +0000 UTC m=+638.043546011" lastFinishedPulling="2026-01-05 13:59:51.719005134 +0000 UTC m=+641.025913713" observedRunningTime="2026-01-05 13:59:52.951340631 +0000 UTC m=+642.258249240" watchObservedRunningTime="2026-01-05 13:59:52.957744472 +0000 UTC m=+642.264653061" Jan 05 13:59:53 crc kubenswrapper[4740]: I0105 13:59:53.023745 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" podStartSLOduration=2.829760375 podStartE2EDuration="7.023720299s" podCreationTimestamp="2026-01-05 13:59:46 +0000 UTC" firstStartedPulling="2026-01-05 13:59:47.448791449 +0000 UTC m=+636.755700028" lastFinishedPulling="2026-01-05 13:59:51.642751373 +0000 UTC m=+640.949659952" observedRunningTime="2026-01-05 13:59:52.99313108 +0000 UTC m=+642.300039669" watchObservedRunningTime="2026-01-05 13:59:53.023720299 +0000 UTC m=+642.330628918" Jan 05 13:59:53 crc kubenswrapper[4740]: I0105 13:59:53.030326 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.742214588 podStartE2EDuration="7.030304134s" podCreationTimestamp="2026-01-05 13:59:46 +0000 UTC" firstStartedPulling="2026-01-05 13:59:48.480144236 +0000 UTC m=+637.787052815" lastFinishedPulling="2026-01-05 13:59:51.768233772 +0000 UTC m=+641.075142361" observedRunningTime="2026-01-05 13:59:53.022763723 +0000 
UTC m=+642.329672312" watchObservedRunningTime="2026-01-05 13:59:53.030304134 +0000 UTC m=+642.337212743" Jan 05 13:59:53 crc kubenswrapper[4740]: I0105 13:59:53.050954 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-76788598db-n78fw" podStartSLOduration=2.878458717 podStartE2EDuration="7.050931597s" podCreationTimestamp="2026-01-05 13:59:46 +0000 UTC" firstStartedPulling="2026-01-05 13:59:47.542074075 +0000 UTC m=+636.848982654" lastFinishedPulling="2026-01-05 13:59:51.714546955 +0000 UTC m=+641.021455534" observedRunningTime="2026-01-05 13:59:53.042966434 +0000 UTC m=+642.349875023" watchObservedRunningTime="2026-01-05 13:59:53.050931597 +0000 UTC m=+642.357840186" Jan 05 13:59:54 crc kubenswrapper[4740]: I0105 13:59:54.897057 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" event={"ID":"fd41c01f-dff3-4b6a-ae38-8b114b384a59","Type":"ContainerStarted","Data":"a18be6d9f4254e4a822416c17f54c71d9ffb0c5676cbf061c8af5c364e2eef78"} Jan 05 13:59:54 crc kubenswrapper[4740]: I0105 13:59:54.897453 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:54 crc kubenswrapper[4740]: I0105 13:59:54.897921 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:54 crc kubenswrapper[4740]: I0105 13:59:54.900113 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8svgw container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8083/ready\": dial tcp 10.217.0.53:8083: connect: connection refused" start-of-body= Jan 05 13:59:54 crc kubenswrapper[4740]: I0105 13:59:54.900160 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" podUID="fd41c01f-dff3-4b6a-ae38-8b114b384a59" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.53:8083/ready\": dial tcp 10.217.0.53:8083: connect: connection refused" Jan 05 13:59:54 crc kubenswrapper[4740]: I0105 13:59:54.906358 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" event={"ID":"1e34aa71-7c05-4606-a71a-2c5b20667ba1","Type":"ContainerStarted","Data":"0a186fb1c0bda745dd04606021145c24f96ca55317f09bdc93cbc51962de5c40"} Jan 05 13:59:54 crc kubenswrapper[4740]: I0105 13:59:54.916767 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:54 crc kubenswrapper[4740]: I0105 13:59:54.921287 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" podStartSLOduration=2.414565441 podStartE2EDuration="8.921272162s" podCreationTimestamp="2026-01-05 13:59:46 +0000 UTC" firstStartedPulling="2026-01-05 13:59:48.095718556 +0000 UTC m=+637.402627135" lastFinishedPulling="2026-01-05 13:59:54.602425237 +0000 UTC m=+643.909333856" observedRunningTime="2026-01-05 13:59:54.917867911 +0000 UTC m=+644.224776490" watchObservedRunningTime="2026-01-05 13:59:54.921272162 +0000 UTC m=+644.228180741" Jan 05 13:59:54 crc kubenswrapper[4740]: I0105 13:59:54.973944 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" 
podStartSLOduration=2.719301239 podStartE2EDuration="8.973926912s" podCreationTimestamp="2026-01-05 13:59:46 +0000 UTC" firstStartedPulling="2026-01-05 13:59:48.373026009 +0000 UTC m=+637.679934588" lastFinishedPulling="2026-01-05 13:59:54.627651662 +0000 UTC m=+643.934560261" observedRunningTime="2026-01-05 13:59:54.968124456 +0000 UTC m=+644.275033035" watchObservedRunningTime="2026-01-05 13:59:54.973926912 +0000 UTC m=+644.280835491" Jan 05 13:59:55 crc kubenswrapper[4740]: I0105 13:59:55.915464 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:55 crc kubenswrapper[4740]: I0105 13:59:55.915543 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:55 crc kubenswrapper[4740]: I0105 13:59:55.930288 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 13:59:55 crc kubenswrapper[4740]: I0105 13:59:55.942845 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" Jan 05 13:59:55 crc kubenswrapper[4740]: I0105 13:59:55.946757 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" Jan 05 14:00:00 crc kubenswrapper[4740]: I0105 14:00:00.144917 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247"] Jan 05 14:00:00 crc kubenswrapper[4740]: I0105 14:00:00.146334 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247" Jan 05 14:00:00 crc kubenswrapper[4740]: I0105 14:00:00.148705 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 05 14:00:00 crc kubenswrapper[4740]: I0105 14:00:00.149006 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 05 14:00:00 crc kubenswrapper[4740]: I0105 14:00:00.159620 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247"] Jan 05 14:00:00 crc kubenswrapper[4740]: I0105 14:00:00.226550 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/187b3735-b06f-4a45-884d-f8ea41af0173-config-volume\") pod \"collect-profiles-29460360-mr247\" (UID: \"187b3735-b06f-4a45-884d-f8ea41af0173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247" Jan 05 14:00:00 crc kubenswrapper[4740]: I0105 14:00:00.226610 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqb5l\" (UniqueName: \"kubernetes.io/projected/187b3735-b06f-4a45-884d-f8ea41af0173-kube-api-access-qqb5l\") pod \"collect-profiles-29460360-mr247\" (UID: \"187b3735-b06f-4a45-884d-f8ea41af0173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247" Jan 05 14:00:00 crc kubenswrapper[4740]: I0105 14:00:00.226730 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/187b3735-b06f-4a45-884d-f8ea41af0173-secret-volume\") pod \"collect-profiles-29460360-mr247\" (UID: \"187b3735-b06f-4a45-884d-f8ea41af0173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247" Jan 05 14:00:00 crc kubenswrapper[4740]: I0105 14:00:00.327859 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/187b3735-b06f-4a45-884d-f8ea41af0173-config-volume\") pod \"collect-profiles-29460360-mr247\" (UID: \"187b3735-b06f-4a45-884d-f8ea41af0173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247" Jan 05 14:00:00 crc kubenswrapper[4740]: I0105 14:00:00.327982 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqb5l\" (UniqueName: \"kubernetes.io/projected/187b3735-b06f-4a45-884d-f8ea41af0173-kube-api-access-qqb5l\") pod \"collect-profiles-29460360-mr247\" (UID: \"187b3735-b06f-4a45-884d-f8ea41af0173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247" Jan 05 14:00:00 crc kubenswrapper[4740]: I0105 14:00:00.328189 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/187b3735-b06f-4a45-884d-f8ea41af0173-secret-volume\") pod \"collect-profiles-29460360-mr247\" (UID: \"187b3735-b06f-4a45-884d-f8ea41af0173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247" Jan 05 14:00:00 crc kubenswrapper[4740]: I0105 14:00:00.330006 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/187b3735-b06f-4a45-884d-f8ea41af0173-config-volume\") pod \"collect-profiles-29460360-mr247\" (UID: \"187b3735-b06f-4a45-884d-f8ea41af0173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247" Jan 05 14:00:00 crc kubenswrapper[4740]: I0105 14:00:00.337680 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/187b3735-b06f-4a45-884d-f8ea41af0173-secret-volume\") pod \"collect-profiles-29460360-mr247\" (UID: \"187b3735-b06f-4a45-884d-f8ea41af0173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247" Jan 05 14:00:00 crc kubenswrapper[4740]: I0105 14:00:00.359620 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqb5l\" (UniqueName: \"kubernetes.io/projected/187b3735-b06f-4a45-884d-f8ea41af0173-kube-api-access-qqb5l\") pod \"collect-profiles-29460360-mr247\" (UID: \"187b3735-b06f-4a45-884d-f8ea41af0173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247" Jan 05 14:00:00 crc kubenswrapper[4740]: I0105 14:00:00.475037 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247" Jan 05 14:00:00 crc kubenswrapper[4740]: I0105 14:00:00.892421 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247"] Jan 05 14:00:00 crc kubenswrapper[4740]: W0105 14:00:00.899660 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod187b3735_b06f_4a45_884d_f8ea41af0173.slice/crio-75d8facb4946a30dbac6dbc53db572acf560ba5d08f8fda112a08604784ff69b WatchSource:0}: Error finding container 75d8facb4946a30dbac6dbc53db572acf560ba5d08f8fda112a08604784ff69b: Status 404 returned error can't find the container with id 75d8facb4946a30dbac6dbc53db572acf560ba5d08f8fda112a08604784ff69b Jan 05 14:00:00 crc kubenswrapper[4740]: I0105 14:00:00.959558 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247" event={"ID":"187b3735-b06f-4a45-884d-f8ea41af0173","Type":"ContainerStarted","Data":"75d8facb4946a30dbac6dbc53db572acf560ba5d08f8fda112a08604784ff69b"} Jan 05 14:00:01 crc kubenswrapper[4740]: I0105 14:00:01.916655 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:00:01 crc kubenswrapper[4740]: I0105 14:00:01.917027 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:00:01 crc kubenswrapper[4740]: I0105 14:00:01.917133 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 14:00:01 crc kubenswrapper[4740]: I0105 14:00:01.918168 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c94fbb7c1e4a27a12915fd96b93743fa30aac1c7bed9369659cc71247bcbb496"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 14:00:01 crc kubenswrapper[4740]: I0105 14:00:01.918262 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://c94fbb7c1e4a27a12915fd96b93743fa30aac1c7bed9369659cc71247bcbb496" gracePeriod=600 Jan 05 14:00:04 crc kubenswrapper[4740]: I0105 14:00:04.003212 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="c94fbb7c1e4a27a12915fd96b93743fa30aac1c7bed9369659cc71247bcbb496" exitCode=0 Jan 05 14:00:04 crc kubenswrapper[4740]: I0105 14:00:04.003271 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"c94fbb7c1e4a27a12915fd96b93743fa30aac1c7bed9369659cc71247bcbb496"} Jan 
05 14:00:04 crc kubenswrapper[4740]: I0105 14:00:04.004307 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"615f01064ee45ac723f59788df185ab69f9b600b12c150fcc649dcf97daf611a"} Jan 05 14:00:04 crc kubenswrapper[4740]: I0105 14:00:04.004351 4740 scope.go:117] "RemoveContainer" containerID="73243fe427b3c563b811bef4fe47899b7220b055a5ab1889d2817322cf522b18" Jan 05 14:00:04 crc kubenswrapper[4740]: I0105 14:00:04.009031 4740 generic.go:334] "Generic (PLEG): container finished" podID="187b3735-b06f-4a45-884d-f8ea41af0173" containerID="6320bd56c1666ea646aa87299c494e9c174c68db6e715868333a449cce98533b" exitCode=0 Jan 05 14:00:04 crc kubenswrapper[4740]: I0105 14:00:04.009121 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247" event={"ID":"187b3735-b06f-4a45-884d-f8ea41af0173","Type":"ContainerDied","Data":"6320bd56c1666ea646aa87299c494e9c174c68db6e715868333a449cce98533b"} Jan 05 14:00:05 crc kubenswrapper[4740]: I0105 14:00:05.432986 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247" Jan 05 14:00:05 crc kubenswrapper[4740]: I0105 14:00:05.530822 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/187b3735-b06f-4a45-884d-f8ea41af0173-config-volume\") pod \"187b3735-b06f-4a45-884d-f8ea41af0173\" (UID: \"187b3735-b06f-4a45-884d-f8ea41af0173\") " Jan 05 14:00:05 crc kubenswrapper[4740]: I0105 14:00:05.530944 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqb5l\" (UniqueName: \"kubernetes.io/projected/187b3735-b06f-4a45-884d-f8ea41af0173-kube-api-access-qqb5l\") pod \"187b3735-b06f-4a45-884d-f8ea41af0173\" (UID: \"187b3735-b06f-4a45-884d-f8ea41af0173\") " Jan 05 14:00:05 crc kubenswrapper[4740]: I0105 14:00:05.531037 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/187b3735-b06f-4a45-884d-f8ea41af0173-secret-volume\") pod \"187b3735-b06f-4a45-884d-f8ea41af0173\" (UID: \"187b3735-b06f-4a45-884d-f8ea41af0173\") " Jan 05 14:00:05 crc kubenswrapper[4740]: I0105 14:00:05.531612 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/187b3735-b06f-4a45-884d-f8ea41af0173-config-volume" (OuterVolumeSpecName: "config-volume") pod "187b3735-b06f-4a45-884d-f8ea41af0173" (UID: "187b3735-b06f-4a45-884d-f8ea41af0173"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:00:05 crc kubenswrapper[4740]: I0105 14:00:05.531870 4740 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/187b3735-b06f-4a45-884d-f8ea41af0173-config-volume\") on node \"crc\" DevicePath \"\"" Jan 05 14:00:05 crc kubenswrapper[4740]: I0105 14:00:05.537826 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/187b3735-b06f-4a45-884d-f8ea41af0173-kube-api-access-qqb5l" (OuterVolumeSpecName: "kube-api-access-qqb5l") pod "187b3735-b06f-4a45-884d-f8ea41af0173" (UID: "187b3735-b06f-4a45-884d-f8ea41af0173"). InnerVolumeSpecName "kube-api-access-qqb5l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:00:05 crc kubenswrapper[4740]: I0105 14:00:05.539359 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187b3735-b06f-4a45-884d-f8ea41af0173-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "187b3735-b06f-4a45-884d-f8ea41af0173" (UID: "187b3735-b06f-4a45-884d-f8ea41af0173"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:00:05 crc kubenswrapper[4740]: I0105 14:00:05.633534 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqb5l\" (UniqueName: \"kubernetes.io/projected/187b3735-b06f-4a45-884d-f8ea41af0173-kube-api-access-qqb5l\") on node \"crc\" DevicePath \"\"" Jan 05 14:00:05 crc kubenswrapper[4740]: I0105 14:00:05.633592 4740 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/187b3735-b06f-4a45-884d-f8ea41af0173-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 05 14:00:06 crc kubenswrapper[4740]: I0105 14:00:06.036668 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247" event={"ID":"187b3735-b06f-4a45-884d-f8ea41af0173","Type":"ContainerDied","Data":"75d8facb4946a30dbac6dbc53db572acf560ba5d08f8fda112a08604784ff69b"} Jan 05 14:00:06 crc kubenswrapper[4740]: I0105 14:00:06.037009 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75d8facb4946a30dbac6dbc53db572acf560ba5d08f8fda112a08604784ff69b" Jan 05 14:00:06 crc kubenswrapper[4740]: I0105 14:00:06.036770 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247" Jan 05 14:00:06 crc kubenswrapper[4740]: I0105 14:00:06.155905 4740 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 05 14:00:06 crc kubenswrapper[4740]: I0105 14:00:06.917818 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 14:00:07 crc kubenswrapper[4740]: I0105 14:00:07.112478 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 14:00:07 crc kubenswrapper[4740]: I0105 14:00:07.169121 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 14:00:08 crc kubenswrapper[4740]: I0105 14:00:08.165373 4740 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Jan 05 14:00:08 crc kubenswrapper[4740]: I0105 14:00:08.165860 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 05 14:00:08 crc kubenswrapper[4740]: I0105 14:00:08.185023 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Jan 05 14:00:08 crc kubenswrapper[4740]: I0105 14:00:08.225822 4740 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Jan 05 14:00:18 crc kubenswrapper[4740]: I0105 14:00:18.157137 4740 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Jan 05 14:00:18 crc kubenswrapper[4740]: I0105 14:00:18.157807 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 05 14:00:28 crc kubenswrapper[4740]: I0105 14:00:28.155612 4740 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Jan 05 14:00:28 crc kubenswrapper[4740]: I0105 14:00:28.156320 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 05 14:00:38 crc kubenswrapper[4740]: I0105 14:00:38.156659 4740 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Jan 05 14:00:38 crc kubenswrapper[4740]: I0105 14:00:38.157347 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 05 14:00:48 crc kubenswrapper[4740]: I0105 14:00:48.159814 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.240679 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-9d54d"] Jan 05 14:01:07 crc kubenswrapper[4740]: E0105 14:01:07.241854 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187b3735-b06f-4a45-884d-f8ea41af0173" containerName="collect-profiles" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.241877 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="187b3735-b06f-4a45-884d-f8ea41af0173" containerName="collect-profiles" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.242182 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="187b3735-b06f-4a45-884d-f8ea41af0173" containerName="collect-profiles" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.243037 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.248402 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.248786 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-qvbcl" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.248918 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.249476 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.249631 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.268426 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.278156 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-9d54d"] Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.388175 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-9d54d"] Jan 05 14:01:07 crc kubenswrapper[4740]: E0105 14:01:07.388951 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-g89ss metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-9d54d" podUID="397ba221-10e4-4d8d-ab85-4c4156072b01" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.411919 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/397ba221-10e4-4d8d-ab85-4c4156072b01-tmp\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.411975 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/397ba221-10e4-4d8d-ab85-4c4156072b01-sa-token\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.411997 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-metrics\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.412025 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-entrypoint\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.412226 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-config-openshift-service-cacrt\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.412270 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-config\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.412293 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g89ss\" (UniqueName: \"kubernetes.io/projected/397ba221-10e4-4d8d-ab85-4c4156072b01-kube-api-access-g89ss\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.412327 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-collector-syslog-receiver\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.412362 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-trusted-ca\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.412392 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/397ba221-10e4-4d8d-ab85-4c4156072b01-datadir\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.412446 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-collector-token\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.513814 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/397ba221-10e4-4d8d-ab85-4c4156072b01-tmp\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.513864 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/397ba221-10e4-4d8d-ab85-4c4156072b01-sa-token\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.513889 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-metrics\") pod 
\"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.513917 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-entrypoint\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.513953 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-config-openshift-service-cacrt\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.513977 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-config\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.514001 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g89ss\" (UniqueName: \"kubernetes.io/projected/397ba221-10e4-4d8d-ab85-4c4156072b01-kube-api-access-g89ss\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.514029 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-collector-syslog-receiver\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.514056 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-trusted-ca\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.514100 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/397ba221-10e4-4d8d-ab85-4c4156072b01-datadir\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.514140 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-collector-token\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.515325 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/397ba221-10e4-4d8d-ab85-4c4156072b01-datadir\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.515820 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-config-openshift-service-cacrt\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.516287 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-trusted-ca\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.516542 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-entrypoint\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.516685 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-config\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.520202 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-collector-token\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.520334 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/397ba221-10e4-4d8d-ab85-4c4156072b01-tmp\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.521232 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-metrics\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.521337 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-collector-syslog-receiver\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.532014 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/397ba221-10e4-4d8d-ab85-4c4156072b01-sa-token\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.535470 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g89ss\" (UniqueName: \"kubernetes.io/projected/397ba221-10e4-4d8d-ab85-4c4156072b01-kube-api-access-g89ss\") pod \"collector-9d54d\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc 
kubenswrapper[4740]: I0105 14:01:07.641784 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.656466 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-9d54d" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.818381 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g89ss\" (UniqueName: \"kubernetes.io/projected/397ba221-10e4-4d8d-ab85-4c4156072b01-kube-api-access-g89ss\") pod \"397ba221-10e4-4d8d-ab85-4c4156072b01\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.818425 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/397ba221-10e4-4d8d-ab85-4c4156072b01-sa-token\") pod \"397ba221-10e4-4d8d-ab85-4c4156072b01\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.818492 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/397ba221-10e4-4d8d-ab85-4c4156072b01-tmp\") pod \"397ba221-10e4-4d8d-ab85-4c4156072b01\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.818576 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-metrics\") pod \"397ba221-10e4-4d8d-ab85-4c4156072b01\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.818608 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-entrypoint\") pod \"397ba221-10e4-4d8d-ab85-4c4156072b01\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.818666 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-collector-token\") pod \"397ba221-10e4-4d8d-ab85-4c4156072b01\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.818705 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-collector-syslog-receiver\") pod \"397ba221-10e4-4d8d-ab85-4c4156072b01\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.818744 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-config-openshift-service-cacrt\") pod \"397ba221-10e4-4d8d-ab85-4c4156072b01\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.818770 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-trusted-ca\") pod \"397ba221-10e4-4d8d-ab85-4c4156072b01\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " Jan 05 14:01:07 crc 
kubenswrapper[4740]: I0105 14:01:07.818806 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/397ba221-10e4-4d8d-ab85-4c4156072b01-datadir\") pod \"397ba221-10e4-4d8d-ab85-4c4156072b01\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.818844 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-config\") pod \"397ba221-10e4-4d8d-ab85-4c4156072b01\" (UID: \"397ba221-10e4-4d8d-ab85-4c4156072b01\") " Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.819481 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "397ba221-10e4-4d8d-ab85-4c4156072b01" (UID: "397ba221-10e4-4d8d-ab85-4c4156072b01"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.819529 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "397ba221-10e4-4d8d-ab85-4c4156072b01" (UID: "397ba221-10e4-4d8d-ab85-4c4156072b01"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.819536 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/397ba221-10e4-4d8d-ab85-4c4156072b01-datadir" (OuterVolumeSpecName: "datadir") pod "397ba221-10e4-4d8d-ab85-4c4156072b01" (UID: "397ba221-10e4-4d8d-ab85-4c4156072b01"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.819583 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "397ba221-10e4-4d8d-ab85-4c4156072b01" (UID: "397ba221-10e4-4d8d-ab85-4c4156072b01"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.820052 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-config" (OuterVolumeSpecName: "config") pod "397ba221-10e4-4d8d-ab85-4c4156072b01" (UID: "397ba221-10e4-4d8d-ab85-4c4156072b01"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.820340 4740 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-entrypoint\") on node \"crc\" DevicePath \"\"" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.820362 4740 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.820376 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.820387 4740 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/397ba221-10e4-4d8d-ab85-4c4156072b01-datadir\") on node \"crc\" DevicePath \"\"" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.820399 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/397ba221-10e4-4d8d-ab85-4c4156072b01-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.826208 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "397ba221-10e4-4d8d-ab85-4c4156072b01" (UID: "397ba221-10e4-4d8d-ab85-4c4156072b01"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.826641 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-collector-token" (OuterVolumeSpecName: "collector-token") pod "397ba221-10e4-4d8d-ab85-4c4156072b01" (UID: "397ba221-10e4-4d8d-ab85-4c4156072b01"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.827213 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-metrics" (OuterVolumeSpecName: "metrics") pod "397ba221-10e4-4d8d-ab85-4c4156072b01" (UID: "397ba221-10e4-4d8d-ab85-4c4156072b01"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.827221 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/397ba221-10e4-4d8d-ab85-4c4156072b01-sa-token" (OuterVolumeSpecName: "sa-token") pod "397ba221-10e4-4d8d-ab85-4c4156072b01" (UID: "397ba221-10e4-4d8d-ab85-4c4156072b01"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.827327 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/397ba221-10e4-4d8d-ab85-4c4156072b01-kube-api-access-g89ss" (OuterVolumeSpecName: "kube-api-access-g89ss") pod "397ba221-10e4-4d8d-ab85-4c4156072b01" (UID: "397ba221-10e4-4d8d-ab85-4c4156072b01"). InnerVolumeSpecName "kube-api-access-g89ss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.834170 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/397ba221-10e4-4d8d-ab85-4c4156072b01-tmp" (OuterVolumeSpecName: "tmp") pod "397ba221-10e4-4d8d-ab85-4c4156072b01" (UID: "397ba221-10e4-4d8d-ab85-4c4156072b01"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.921612 4740 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/397ba221-10e4-4d8d-ab85-4c4156072b01-tmp\") on node \"crc\" DevicePath \"\"" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.921641 4740 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-metrics\") on node \"crc\" DevicePath \"\"" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.921651 4740 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-collector-token\") on node \"crc\" DevicePath \"\"" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.921661 4740 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/397ba221-10e4-4d8d-ab85-4c4156072b01-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.921671 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g89ss\" (UniqueName: \"kubernetes.io/projected/397ba221-10e4-4d8d-ab85-4c4156072b01-kube-api-access-g89ss\") on node \"crc\" DevicePath \"\"" Jan 05 14:01:07 crc kubenswrapper[4740]: I0105 14:01:07.921680 4740 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/397ba221-10e4-4d8d-ab85-4c4156072b01-sa-token\") on node \"crc\" DevicePath \"\"" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.651209 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-9d54d" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.761982 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-9d54d"] Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.774945 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-bx9tf"] Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.776641 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-bx9tf" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.785964 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-9d54d"] Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.787466 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-qvbcl" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.787916 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.788244 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.789569 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.789836 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.803113 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.808175 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-bx9tf"] Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.939844 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ed4602b2-acd5-4357-98b4-b0b016dc8a61-config-openshift-service-cacrt\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.939954 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ed4602b2-acd5-4357-98b4-b0b016dc8a61-datadir\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.940018 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ed4602b2-acd5-4357-98b4-b0b016dc8a61-sa-token\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.940216 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ed4602b2-acd5-4357-98b4-b0b016dc8a61-entrypoint\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.940303 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed4602b2-acd5-4357-98b4-b0b016dc8a61-config\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.940348 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: 
\"kubernetes.io/secret/ed4602b2-acd5-4357-98b4-b0b016dc8a61-collector-token\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.940380 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh4dr\" (UniqueName: \"kubernetes.io/projected/ed4602b2-acd5-4357-98b4-b0b016dc8a61-kube-api-access-gh4dr\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.940547 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed4602b2-acd5-4357-98b4-b0b016dc8a61-trusted-ca\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.940644 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ed4602b2-acd5-4357-98b4-b0b016dc8a61-metrics\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.940718 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed4602b2-acd5-4357-98b4-b0b016dc8a61-tmp\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.940803 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ed4602b2-acd5-4357-98b4-b0b016dc8a61-collector-syslog-receiver\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:08 crc kubenswrapper[4740]: I0105 14:01:08.987309 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="397ba221-10e4-4d8d-ab85-4c4156072b01" path="/var/lib/kubelet/pods/397ba221-10e4-4d8d-ab85-4c4156072b01/volumes" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.043485 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed4602b2-acd5-4357-98b4-b0b016dc8a61-config\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.043578 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ed4602b2-acd5-4357-98b4-b0b016dc8a61-collector-token\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.043618 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh4dr\" (UniqueName: \"kubernetes.io/projected/ed4602b2-acd5-4357-98b4-b0b016dc8a61-kube-api-access-gh4dr\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.043683 
4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed4602b2-acd5-4357-98b4-b0b016dc8a61-trusted-ca\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.043733 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ed4602b2-acd5-4357-98b4-b0b016dc8a61-metrics\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.043781 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed4602b2-acd5-4357-98b4-b0b016dc8a61-tmp\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.043834 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ed4602b2-acd5-4357-98b4-b0b016dc8a61-collector-syslog-receiver\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.043934 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ed4602b2-acd5-4357-98b4-b0b016dc8a61-config-openshift-service-cacrt\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.044035 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ed4602b2-acd5-4357-98b4-b0b016dc8a61-datadir\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.044138 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ed4602b2-acd5-4357-98b4-b0b016dc8a61-sa-token\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.044210 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ed4602b2-acd5-4357-98b4-b0b016dc8a61-entrypoint\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.044209 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ed4602b2-acd5-4357-98b4-b0b016dc8a61-datadir\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.044852 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ed4602b2-acd5-4357-98b4-b0b016dc8a61-config-openshift-service-cacrt\") pod \"collector-bx9tf\" (UID: 
\"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.045113 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed4602b2-acd5-4357-98b4-b0b016dc8a61-trusted-ca\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.045203 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ed4602b2-acd5-4357-98b4-b0b016dc8a61-entrypoint\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.045335 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed4602b2-acd5-4357-98b4-b0b016dc8a61-config\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.048644 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ed4602b2-acd5-4357-98b4-b0b016dc8a61-collector-syslog-receiver\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.053724 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ed4602b2-acd5-4357-98b4-b0b016dc8a61-collector-token\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.060108 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ed4602b2-acd5-4357-98b4-b0b016dc8a61-metrics\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.060553 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ed4602b2-acd5-4357-98b4-b0b016dc8a61-sa-token\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.061298 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed4602b2-acd5-4357-98b4-b0b016dc8a61-tmp\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.063941 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh4dr\" (UniqueName: \"kubernetes.io/projected/ed4602b2-acd5-4357-98b4-b0b016dc8a61-kube-api-access-gh4dr\") pod \"collector-bx9tf\" (UID: \"ed4602b2-acd5-4357-98b4-b0b016dc8a61\") " pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.116016 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-bx9tf" Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.597921 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-bx9tf"] Jan 05 14:01:09 crc kubenswrapper[4740]: I0105 14:01:09.660043 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-bx9tf" event={"ID":"ed4602b2-acd5-4357-98b4-b0b016dc8a61","Type":"ContainerStarted","Data":"17bc6576a21ac6168d75226d75b8a5150ec779f6f98686465973dedd20ec4b39"} Jan 05 14:01:16 crc kubenswrapper[4740]: I0105 14:01:16.731555 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-bx9tf" event={"ID":"ed4602b2-acd5-4357-98b4-b0b016dc8a61","Type":"ContainerStarted","Data":"eccb344219b12edb1a3ba3ba919a3f7d3c5b257929a4ead584862623cd015433"} Jan 05 14:01:16 crc kubenswrapper[4740]: I0105 14:01:16.755675 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-bx9tf" podStartSLOduration=1.939928602 podStartE2EDuration="8.755657337s" podCreationTimestamp="2026-01-05 14:01:08 +0000 UTC" firstStartedPulling="2026-01-05 14:01:09.599754936 +0000 UTC m=+718.906663515" lastFinishedPulling="2026-01-05 14:01:16.415483681 +0000 UTC m=+725.722392250" observedRunningTime="2026-01-05 14:01:16.755471502 +0000 UTC m=+726.062380081" watchObservedRunningTime="2026-01-05 14:01:16.755657337 +0000 UTC m=+726.062565926" Jan 05 14:01:51 crc kubenswrapper[4740]: I0105 14:01:51.834561 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94"] Jan 05 14:01:51 crc kubenswrapper[4740]: I0105 14:01:51.839527 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" Jan 05 14:01:51 crc kubenswrapper[4740]: I0105 14:01:51.843784 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 05 14:01:51 crc kubenswrapper[4740]: I0105 14:01:51.846005 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94"] Jan 05 14:01:51 crc kubenswrapper[4740]: I0105 14:01:51.897104 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dc34484-b121-476e-8aa8-e969485032b5-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94\" (UID: \"0dc34484-b121-476e-8aa8-e969485032b5\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" Jan 05 14:01:51 crc kubenswrapper[4740]: I0105 14:01:51.897900 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79k6h\" (UniqueName: \"kubernetes.io/projected/0dc34484-b121-476e-8aa8-e969485032b5-kube-api-access-79k6h\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94\" (UID: \"0dc34484-b121-476e-8aa8-e969485032b5\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" Jan 05 14:01:51 crc kubenswrapper[4740]: I0105 14:01:51.897993 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dc34484-b121-476e-8aa8-e969485032b5-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94\" (UID: \"0dc34484-b121-476e-8aa8-e969485032b5\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" Jan 05 14:01:52 crc kubenswrapper[4740]: I0105 14:01:52.000765 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dc34484-b121-476e-8aa8-e969485032b5-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94\" (UID: \"0dc34484-b121-476e-8aa8-e969485032b5\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" Jan 05 14:01:52 crc kubenswrapper[4740]: I0105 14:01:52.001244 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79k6h\" (UniqueName: \"kubernetes.io/projected/0dc34484-b121-476e-8aa8-e969485032b5-kube-api-access-79k6h\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94\" (UID: \"0dc34484-b121-476e-8aa8-e969485032b5\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" Jan 05 14:01:52 crc kubenswrapper[4740]: I0105 14:01:52.001362 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dc34484-b121-476e-8aa8-e969485032b5-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94\" (UID: \"0dc34484-b121-476e-8aa8-e969485032b5\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" Jan 05 14:01:52 crc kubenswrapper[4740]: I0105 14:01:52.001656 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/0dc34484-b121-476e-8aa8-e969485032b5-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94\" (UID: \"0dc34484-b121-476e-8aa8-e969485032b5\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" Jan 05 14:01:52 crc kubenswrapper[4740]: I0105 14:01:52.001895 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dc34484-b121-476e-8aa8-e969485032b5-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94\" (UID: \"0dc34484-b121-476e-8aa8-e969485032b5\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" Jan 05 14:01:52 crc kubenswrapper[4740]: I0105 14:01:52.027210 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79k6h\" (UniqueName: \"kubernetes.io/projected/0dc34484-b121-476e-8aa8-e969485032b5-kube-api-access-79k6h\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94\" (UID: \"0dc34484-b121-476e-8aa8-e969485032b5\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" Jan 05 14:01:52 crc kubenswrapper[4740]: I0105 14:01:52.161772 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" Jan 05 14:01:52 crc kubenswrapper[4740]: I0105 14:01:52.432394 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94"] Jan 05 14:01:52 crc kubenswrapper[4740]: W0105 14:01:52.445820 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dc34484_b121_476e_8aa8_e969485032b5.slice/crio-13fb523b972298c0149081658e10fbee6e0e467ce15ad95cd2998a874440f0b6 WatchSource:0}: Error finding container 13fb523b972298c0149081658e10fbee6e0e467ce15ad95cd2998a874440f0b6: Status 404 returned error can't find the container with id 13fb523b972298c0149081658e10fbee6e0e467ce15ad95cd2998a874440f0b6 Jan 05 14:01:53 crc kubenswrapper[4740]: I0105 14:01:53.097323 4740 generic.go:334] "Generic (PLEG): container finished" podID="0dc34484-b121-476e-8aa8-e969485032b5" containerID="f603c362fa8b75e8607ad76cdc0cae3082eb84c6abb43ee07c5e240970e1612c" exitCode=0 Jan 05 14:01:53 crc kubenswrapper[4740]: I0105 14:01:53.097436 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" event={"ID":"0dc34484-b121-476e-8aa8-e969485032b5","Type":"ContainerDied","Data":"f603c362fa8b75e8607ad76cdc0cae3082eb84c6abb43ee07c5e240970e1612c"} Jan 05 14:01:53 crc kubenswrapper[4740]: I0105 14:01:53.097643 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" event={"ID":"0dc34484-b121-476e-8aa8-e969485032b5","Type":"ContainerStarted","Data":"13fb523b972298c0149081658e10fbee6e0e467ce15ad95cd2998a874440f0b6"} Jan 05 14:01:54 crc kubenswrapper[4740]: I0105 14:01:54.166085 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tpvh6"] Jan 05 14:01:54 crc kubenswrapper[4740]: I0105 14:01:54.169520 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tpvh6" Jan 05 14:01:54 crc kubenswrapper[4740]: I0105 14:01:54.221009 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tpvh6"] Jan 05 14:01:54 crc kubenswrapper[4740]: I0105 14:01:54.257897 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxwmw\" (UniqueName: \"kubernetes.io/projected/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-kube-api-access-hxwmw\") pod \"redhat-operators-tpvh6\" (UID: \"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1\") " pod="openshift-marketplace/redhat-operators-tpvh6" Jan 05 14:01:54 crc kubenswrapper[4740]: I0105 14:01:54.258250 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-catalog-content\") pod \"redhat-operators-tpvh6\" (UID: \"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1\") " pod="openshift-marketplace/redhat-operators-tpvh6" Jan 05 14:01:54 crc kubenswrapper[4740]: I0105 14:01:54.258295 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-utilities\") pod \"redhat-operators-tpvh6\" (UID: \"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1\") " pod="openshift-marketplace/redhat-operators-tpvh6" Jan 05 14:01:54 crc kubenswrapper[4740]: I0105 14:01:54.359553 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxwmw\" (UniqueName: \"kubernetes.io/projected/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-kube-api-access-hxwmw\") pod \"redhat-operators-tpvh6\" (UID: \"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1\") " pod="openshift-marketplace/redhat-operators-tpvh6" Jan 05 14:01:54 crc kubenswrapper[4740]: I0105 14:01:54.359611 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-catalog-content\") pod \"redhat-operators-tpvh6\" (UID: \"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1\") " pod="openshift-marketplace/redhat-operators-tpvh6" Jan 05 14:01:54 crc kubenswrapper[4740]: I0105 14:01:54.359648 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-utilities\") pod \"redhat-operators-tpvh6\" (UID: \"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1\") " pod="openshift-marketplace/redhat-operators-tpvh6" Jan 05 14:01:54 crc kubenswrapper[4740]: I0105 14:01:54.360235 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-catalog-content\") pod \"redhat-operators-tpvh6\" (UID: \"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1\") " pod="openshift-marketplace/redhat-operators-tpvh6" Jan 05 14:01:54 crc kubenswrapper[4740]: I0105 14:01:54.360319 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-utilities\") pod \"redhat-operators-tpvh6\" (UID: \"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1\") " pod="openshift-marketplace/redhat-operators-tpvh6" Jan 05 14:01:54 crc kubenswrapper[4740]: I0105 14:01:54.398136 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hxwmw\" (UniqueName: \"kubernetes.io/projected/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-kube-api-access-hxwmw\") pod \"redhat-operators-tpvh6\" (UID: \"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1\") " pod="openshift-marketplace/redhat-operators-tpvh6" Jan 05 14:01:54 crc kubenswrapper[4740]: I0105 14:01:54.507302 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tpvh6" Jan 05 14:01:54 crc kubenswrapper[4740]: I0105 14:01:54.975455 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tpvh6"] Jan 05 14:01:55 crc kubenswrapper[4740]: I0105 14:01:55.114257 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpvh6" event={"ID":"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1","Type":"ContainerStarted","Data":"870fe20c480f050eb13dd8ea0db313a35862cb057ba57916ad86da36b852b29a"} Jan 05 14:01:55 crc kubenswrapper[4740]: I0105 14:01:55.114300 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpvh6" event={"ID":"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1","Type":"ContainerStarted","Data":"aaeadaa3774a767adedcb1950e3b5f8d7c59a9eed14cc49caaf2f61457943baf"} Jan 05 14:01:55 crc kubenswrapper[4740]: I0105 14:01:55.118578 4740 generic.go:334] "Generic (PLEG): container finished" podID="0dc34484-b121-476e-8aa8-e969485032b5" containerID="c6c69972123daa7a2e14c33b261cd47cc4e782d45fcdfac021f2d58fe577329b" exitCode=0 Jan 05 14:01:55 crc kubenswrapper[4740]: I0105 14:01:55.118648 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" event={"ID":"0dc34484-b121-476e-8aa8-e969485032b5","Type":"ContainerDied","Data":"c6c69972123daa7a2e14c33b261cd47cc4e782d45fcdfac021f2d58fe577329b"} Jan 05 14:01:56 crc kubenswrapper[4740]: I0105 14:01:56.128431 4740 generic.go:334] "Generic (PLEG): container finished" podID="8a9adf60-c879-46ae-b05d-e5d9d93bc1e1" containerID="870fe20c480f050eb13dd8ea0db313a35862cb057ba57916ad86da36b852b29a" exitCode=0 Jan 05 14:01:56 crc kubenswrapper[4740]: I0105 14:01:56.128544 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpvh6" event={"ID":"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1","Type":"ContainerDied","Data":"870fe20c480f050eb13dd8ea0db313a35862cb057ba57916ad86da36b852b29a"} Jan 05 14:01:56 crc kubenswrapper[4740]: I0105 14:01:56.132185 4740 generic.go:334] "Generic (PLEG): container finished" podID="0dc34484-b121-476e-8aa8-e969485032b5" containerID="bdb79bbe3390ee3fa2550b16e3ee21bc75321e33f0cb82008cd6a19ed23dc146" exitCode=0 Jan 05 14:01:56 crc kubenswrapper[4740]: I0105 14:01:56.132250 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" event={"ID":"0dc34484-b121-476e-8aa8-e969485032b5","Type":"ContainerDied","Data":"bdb79bbe3390ee3fa2550b16e3ee21bc75321e33f0cb82008cd6a19ed23dc146"} Jan 05 14:01:57 crc kubenswrapper[4740]: I0105 14:01:57.399247 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" Jan 05 14:01:57 crc kubenswrapper[4740]: I0105 14:01:57.534845 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dc34484-b121-476e-8aa8-e969485032b5-util\") pod \"0dc34484-b121-476e-8aa8-e969485032b5\" (UID: \"0dc34484-b121-476e-8aa8-e969485032b5\") " Jan 05 14:01:57 crc kubenswrapper[4740]: I0105 14:01:57.534960 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79k6h\" (UniqueName: \"kubernetes.io/projected/0dc34484-b121-476e-8aa8-e969485032b5-kube-api-access-79k6h\") pod \"0dc34484-b121-476e-8aa8-e969485032b5\" (UID: \"0dc34484-b121-476e-8aa8-e969485032b5\") " Jan 05 14:01:57 crc kubenswrapper[4740]: I0105 14:01:57.535048 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dc34484-b121-476e-8aa8-e969485032b5-bundle\") pod \"0dc34484-b121-476e-8aa8-e969485032b5\" (UID: \"0dc34484-b121-476e-8aa8-e969485032b5\") " Jan 05 14:01:57 crc kubenswrapper[4740]: I0105 14:01:57.535662 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dc34484-b121-476e-8aa8-e969485032b5-bundle" (OuterVolumeSpecName: "bundle") pod "0dc34484-b121-476e-8aa8-e969485032b5" (UID: "0dc34484-b121-476e-8aa8-e969485032b5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:01:57 crc kubenswrapper[4740]: I0105 14:01:57.544295 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dc34484-b121-476e-8aa8-e969485032b5-kube-api-access-79k6h" (OuterVolumeSpecName: "kube-api-access-79k6h") pod "0dc34484-b121-476e-8aa8-e969485032b5" (UID: "0dc34484-b121-476e-8aa8-e969485032b5"). InnerVolumeSpecName "kube-api-access-79k6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:01:57 crc kubenswrapper[4740]: I0105 14:01:57.552784 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dc34484-b121-476e-8aa8-e969485032b5-util" (OuterVolumeSpecName: "util") pod "0dc34484-b121-476e-8aa8-e969485032b5" (UID: "0dc34484-b121-476e-8aa8-e969485032b5"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:01:57 crc kubenswrapper[4740]: I0105 14:01:57.636666 4740 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dc34484-b121-476e-8aa8-e969485032b5-util\") on node \"crc\" DevicePath \"\"" Jan 05 14:01:57 crc kubenswrapper[4740]: I0105 14:01:57.636699 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79k6h\" (UniqueName: \"kubernetes.io/projected/0dc34484-b121-476e-8aa8-e969485032b5-kube-api-access-79k6h\") on node \"crc\" DevicePath \"\"" Jan 05 14:01:57 crc kubenswrapper[4740]: I0105 14:01:57.636714 4740 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dc34484-b121-476e-8aa8-e969485032b5-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:01:58 crc kubenswrapper[4740]: I0105 14:01:58.151273 4740 generic.go:334] "Generic (PLEG): container finished" podID="8a9adf60-c879-46ae-b05d-e5d9d93bc1e1" containerID="3301d39a25fa4c4017afb333e3c453f91a926d1b6fba97c067fffaa2cdf95047" exitCode=0 Jan 05 14:01:58 crc kubenswrapper[4740]: I0105 14:01:58.151383 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpvh6" event={"ID":"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1","Type":"ContainerDied","Data":"3301d39a25fa4c4017afb333e3c453f91a926d1b6fba97c067fffaa2cdf95047"} Jan 05 14:01:58 crc kubenswrapper[4740]: I0105 14:01:58.158220 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" event={"ID":"0dc34484-b121-476e-8aa8-e969485032b5","Type":"ContainerDied","Data":"13fb523b972298c0149081658e10fbee6e0e467ce15ad95cd2998a874440f0b6"} Jan 05 14:01:58 crc kubenswrapper[4740]: I0105 14:01:58.158261 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13fb523b972298c0149081658e10fbee6e0e467ce15ad95cd2998a874440f0b6" Jan 05 14:01:58 crc kubenswrapper[4740]: I0105 14:01:58.158343 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94" Jan 05 14:01:59 crc kubenswrapper[4740]: I0105 14:01:59.169583 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpvh6" event={"ID":"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1","Type":"ContainerStarted","Data":"2b4617179a1a9eeda3d720ff17c3839ed22895ff8b1158ff1e1446b90e802c6a"} Jan 05 14:01:59 crc kubenswrapper[4740]: I0105 14:01:59.204439 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tpvh6" podStartSLOduration=2.572051482 podStartE2EDuration="5.204417822s" podCreationTimestamp="2026-01-05 14:01:54 +0000 UTC" firstStartedPulling="2026-01-05 14:01:56.130801968 +0000 UTC m=+765.437710577" lastFinishedPulling="2026-01-05 14:01:58.763168338 +0000 UTC m=+768.070076917" observedRunningTime="2026-01-05 14:01:59.195736832 +0000 UTC m=+768.502645421" watchObservedRunningTime="2026-01-05 14:01:59.204417822 +0000 UTC m=+768.511326411" Jan 05 14:02:00 crc kubenswrapper[4740]: I0105 14:02:00.981195 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-bg7t9"] Jan 05 14:02:00 crc kubenswrapper[4740]: E0105 14:02:00.981687 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc34484-b121-476e-8aa8-e969485032b5" containerName="util" Jan 05 14:02:00 crc kubenswrapper[4740]: I0105 14:02:00.981714 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc34484-b121-476e-8aa8-e969485032b5" containerName="util" Jan 05 14:02:00 crc kubenswrapper[4740]: E0105 14:02:00.981740 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc34484-b121-476e-8aa8-e969485032b5" containerName="extract" Jan 05 14:02:00 crc kubenswrapper[4740]: I0105 14:02:00.981746 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc34484-b121-476e-8aa8-e969485032b5" containerName="extract" Jan 05 14:02:00 crc kubenswrapper[4740]: E0105 14:02:00.981762 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc34484-b121-476e-8aa8-e969485032b5" containerName="pull" Jan 05 14:02:00 crc kubenswrapper[4740]: I0105 14:02:00.981768 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc34484-b121-476e-8aa8-e969485032b5" containerName="pull" Jan 05 14:02:00 crc kubenswrapper[4740]: I0105 14:02:00.981926 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dc34484-b121-476e-8aa8-e969485032b5" containerName="extract" Jan 05 14:02:00 crc kubenswrapper[4740]: I0105 14:02:00.982482 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-bg7t9" Jan 05 14:02:00 crc kubenswrapper[4740]: I0105 14:02:00.985556 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 05 14:02:00 crc kubenswrapper[4740]: I0105 14:02:00.985601 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 05 14:02:00 crc kubenswrapper[4740]: I0105 14:02:00.985906 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-sffvf" Jan 05 14:02:01 crc kubenswrapper[4740]: I0105 14:02:01.009290 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-bg7t9"] Jan 05 14:02:01 crc kubenswrapper[4740]: I0105 14:02:01.097054 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcmrv\" (UniqueName: \"kubernetes.io/projected/49e4bc2c-7786-4f56-8ec8-650c1a388d67-kube-api-access-bcmrv\") pod \"nmstate-operator-6769fb99d-bg7t9\" (UID: \"49e4bc2c-7786-4f56-8ec8-650c1a388d67\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-bg7t9" Jan 05 14:02:01 crc kubenswrapper[4740]: I0105 14:02:01.199200 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcmrv\" (UniqueName: \"kubernetes.io/projected/49e4bc2c-7786-4f56-8ec8-650c1a388d67-kube-api-access-bcmrv\") pod \"nmstate-operator-6769fb99d-bg7t9\" (UID: \"49e4bc2c-7786-4f56-8ec8-650c1a388d67\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-bg7t9" Jan 05 14:02:01 crc kubenswrapper[4740]: I0105 14:02:01.223264 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcmrv\" (UniqueName: \"kubernetes.io/projected/49e4bc2c-7786-4f56-8ec8-650c1a388d67-kube-api-access-bcmrv\") pod \"nmstate-operator-6769fb99d-bg7t9\" (UID: \"49e4bc2c-7786-4f56-8ec8-650c1a388d67\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-bg7t9" Jan 05 14:02:01 crc kubenswrapper[4740]: I0105 14:02:01.297817 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-bg7t9" Jan 05 14:02:01 crc kubenswrapper[4740]: I0105 14:02:01.710218 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-bg7t9"] Jan 05 14:02:01 crc kubenswrapper[4740]: W0105 14:02:01.712980 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49e4bc2c_7786_4f56_8ec8_650c1a388d67.slice/crio-6e98c1817f76c4e8c5171759136e034c2d6d3496a33e0b0cee387f354c400db0 WatchSource:0}: Error finding container 6e98c1817f76c4e8c5171759136e034c2d6d3496a33e0b0cee387f354c400db0: Status 404 returned error can't find the container with id 6e98c1817f76c4e8c5171759136e034c2d6d3496a33e0b0cee387f354c400db0 Jan 05 14:02:02 crc kubenswrapper[4740]: I0105 14:02:02.188981 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-bg7t9" event={"ID":"49e4bc2c-7786-4f56-8ec8-650c1a388d67","Type":"ContainerStarted","Data":"6e98c1817f76c4e8c5171759136e034c2d6d3496a33e0b0cee387f354c400db0"} Jan 05 14:02:04 crc kubenswrapper[4740]: I0105 14:02:04.507934 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tpvh6" Jan 05 14:02:04 crc kubenswrapper[4740]: I0105 14:02:04.509213 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tpvh6" Jan 05 14:02:05 crc kubenswrapper[4740]: I0105 14:02:05.218457 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-bg7t9" event={"ID":"49e4bc2c-7786-4f56-8ec8-650c1a388d67","Type":"ContainerStarted","Data":"2eeb0ee33a55f14d095a674cea080b78218abf604b350f25b0fa4651a558d23a"} Jan 05 14:02:05 crc kubenswrapper[4740]: I0105 14:02:05.246534 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-6769fb99d-bg7t9" podStartSLOduration=2.339120186 podStartE2EDuration="5.246511212s" podCreationTimestamp="2026-01-05 14:02:00 +0000 UTC" firstStartedPulling="2026-01-05 14:02:01.715309592 +0000 UTC m=+771.022218171" lastFinishedPulling="2026-01-05 14:02:04.622700618 +0000 UTC m=+773.929609197" observedRunningTime="2026-01-05 14:02:05.243703348 +0000 UTC m=+774.550611987" watchObservedRunningTime="2026-01-05 14:02:05.246511212 +0000 UTC m=+774.553419831" Jan 05 14:02:05 crc kubenswrapper[4740]: I0105 14:02:05.553145 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tpvh6" podUID="8a9adf60-c879-46ae-b05d-e5d9d93bc1e1" containerName="registry-server" probeResult="failure" output=< Jan 05 14:02:05 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 14:02:05 crc kubenswrapper[4740]: > Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.286007 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-qpc9m"] Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.287835 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-qpc9m" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.289796 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-bwpc9" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.301683 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-qpc9m"] Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.311326 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k"] Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.315112 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.326222 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.362994 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k"] Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.370763 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-58hgk"] Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.371990 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-58hgk" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.398284 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e6df01b0-f4c2-49c2-982d-4b814fd5d493-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-zvv7k\" (UID: \"e6df01b0-f4c2-49c2-982d-4b814fd5d493\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.398331 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zj94\" (UniqueName: \"kubernetes.io/projected/e6df01b0-f4c2-49c2-982d-4b814fd5d493-kube-api-access-6zj94\") pod \"nmstate-webhook-f8fb84555-zvv7k\" (UID: \"e6df01b0-f4c2-49c2-982d-4b814fd5d493\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.398367 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fcjf\" (UniqueName: \"kubernetes.io/projected/7b892142-9e83-40e5-b305-28c1c3dde6d5-kube-api-access-6fcjf\") pod \"nmstate-metrics-7f7f7578db-qpc9m\" (UID: \"7b892142-9e83-40e5-b305-28c1c3dde6d5\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-qpc9m" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.461835 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-b7fh2"] Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.462859 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b7fh2" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.464620 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.465885 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.466032 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-nqsdb" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.468825 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-b7fh2"] Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.502348 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a4bfce1d-9af3-49f5-877e-b5ea29088ac7-dbus-socket\") pod \"nmstate-handler-58hgk\" (UID: \"a4bfce1d-9af3-49f5-877e-b5ea29088ac7\") " pod="openshift-nmstate/nmstate-handler-58hgk" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.502487 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a4bfce1d-9af3-49f5-877e-b5ea29088ac7-nmstate-lock\") pod \"nmstate-handler-58hgk\" (UID: \"a4bfce1d-9af3-49f5-877e-b5ea29088ac7\") " pod="openshift-nmstate/nmstate-handler-58hgk" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.502549 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e6df01b0-f4c2-49c2-982d-4b814fd5d493-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-zvv7k\" (UID: \"e6df01b0-f4c2-49c2-982d-4b814fd5d493\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.502609 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zj94\" (UniqueName: \"kubernetes.io/projected/e6df01b0-f4c2-49c2-982d-4b814fd5d493-kube-api-access-6zj94\") pod \"nmstate-webhook-f8fb84555-zvv7k\" (UID: \"e6df01b0-f4c2-49c2-982d-4b814fd5d493\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.502651 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvbd6\" (UniqueName: \"kubernetes.io/projected/a4bfce1d-9af3-49f5-877e-b5ea29088ac7-kube-api-access-xvbd6\") pod \"nmstate-handler-58hgk\" (UID: \"a4bfce1d-9af3-49f5-877e-b5ea29088ac7\") " pod="openshift-nmstate/nmstate-handler-58hgk" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.502680 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a4bfce1d-9af3-49f5-877e-b5ea29088ac7-ovs-socket\") pod \"nmstate-handler-58hgk\" (UID: \"a4bfce1d-9af3-49f5-877e-b5ea29088ac7\") " pod="openshift-nmstate/nmstate-handler-58hgk" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.502719 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fcjf\" (UniqueName: \"kubernetes.io/projected/7b892142-9e83-40e5-b305-28c1c3dde6d5-kube-api-access-6fcjf\") pod \"nmstate-metrics-7f7f7578db-qpc9m\" (UID: 
\"7b892142-9e83-40e5-b305-28c1c3dde6d5\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-qpc9m" Jan 05 14:02:06 crc kubenswrapper[4740]: E0105 14:02:06.503313 4740 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Jan 05 14:02:06 crc kubenswrapper[4740]: E0105 14:02:06.503379 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6df01b0-f4c2-49c2-982d-4b814fd5d493-tls-key-pair podName:e6df01b0-f4c2-49c2-982d-4b814fd5d493 nodeName:}" failed. No retries permitted until 2026-01-05 14:02:07.003358144 +0000 UTC m=+776.310266723 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/e6df01b0-f4c2-49c2-982d-4b814fd5d493-tls-key-pair") pod "nmstate-webhook-f8fb84555-zvv7k" (UID: "e6df01b0-f4c2-49c2-982d-4b814fd5d493") : secret "openshift-nmstate-webhook" not found Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.526200 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fcjf\" (UniqueName: \"kubernetes.io/projected/7b892142-9e83-40e5-b305-28c1c3dde6d5-kube-api-access-6fcjf\") pod \"nmstate-metrics-7f7f7578db-qpc9m\" (UID: \"7b892142-9e83-40e5-b305-28c1c3dde6d5\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-qpc9m" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.545110 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zj94\" (UniqueName: \"kubernetes.io/projected/e6df01b0-f4c2-49c2-982d-4b814fd5d493-kube-api-access-6zj94\") pod \"nmstate-webhook-f8fb84555-zvv7k\" (UID: \"e6df01b0-f4c2-49c2-982d-4b814fd5d493\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.603608 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-qpc9m" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.604209 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f49ae0c4-aa16-448b-859d-45ed8809ac9d-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-b7fh2\" (UID: \"f49ae0c4-aa16-448b-859d-45ed8809ac9d\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b7fh2" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.604290 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f49ae0c4-aa16-448b-859d-45ed8809ac9d-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-b7fh2\" (UID: \"f49ae0c4-aa16-448b-859d-45ed8809ac9d\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b7fh2" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.604373 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a4bfce1d-9af3-49f5-877e-b5ea29088ac7-nmstate-lock\") pod \"nmstate-handler-58hgk\" (UID: \"a4bfce1d-9af3-49f5-877e-b5ea29088ac7\") " pod="openshift-nmstate/nmstate-handler-58hgk" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.604443 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a4bfce1d-9af3-49f5-877e-b5ea29088ac7-nmstate-lock\") pod \"nmstate-handler-58hgk\" (UID: \"a4bfce1d-9af3-49f5-877e-b5ea29088ac7\") " pod="openshift-nmstate/nmstate-handler-58hgk" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.604619 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6f6v\" (UniqueName: \"kubernetes.io/projected/f49ae0c4-aa16-448b-859d-45ed8809ac9d-kube-api-access-p6f6v\") pod \"nmstate-console-plugin-6ff7998486-b7fh2\" (UID: \"f49ae0c4-aa16-448b-859d-45ed8809ac9d\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b7fh2" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.604694 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvbd6\" (UniqueName: \"kubernetes.io/projected/a4bfce1d-9af3-49f5-877e-b5ea29088ac7-kube-api-access-xvbd6\") pod \"nmstate-handler-58hgk\" (UID: \"a4bfce1d-9af3-49f5-877e-b5ea29088ac7\") " pod="openshift-nmstate/nmstate-handler-58hgk" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.604719 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a4bfce1d-9af3-49f5-877e-b5ea29088ac7-ovs-socket\") pod \"nmstate-handler-58hgk\" (UID: \"a4bfce1d-9af3-49f5-877e-b5ea29088ac7\") " pod="openshift-nmstate/nmstate-handler-58hgk" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.604835 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a4bfce1d-9af3-49f5-877e-b5ea29088ac7-dbus-socket\") pod \"nmstate-handler-58hgk\" (UID: \"a4bfce1d-9af3-49f5-877e-b5ea29088ac7\") " pod="openshift-nmstate/nmstate-handler-58hgk" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.604846 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a4bfce1d-9af3-49f5-877e-b5ea29088ac7-ovs-socket\") pod 
\"nmstate-handler-58hgk\" (UID: \"a4bfce1d-9af3-49f5-877e-b5ea29088ac7\") " pod="openshift-nmstate/nmstate-handler-58hgk" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.605035 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a4bfce1d-9af3-49f5-877e-b5ea29088ac7-dbus-socket\") pod \"nmstate-handler-58hgk\" (UID: \"a4bfce1d-9af3-49f5-877e-b5ea29088ac7\") " pod="openshift-nmstate/nmstate-handler-58hgk" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.626455 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvbd6\" (UniqueName: \"kubernetes.io/projected/a4bfce1d-9af3-49f5-877e-b5ea29088ac7-kube-api-access-xvbd6\") pod \"nmstate-handler-58hgk\" (UID: \"a4bfce1d-9af3-49f5-877e-b5ea29088ac7\") " pod="openshift-nmstate/nmstate-handler-58hgk" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.653499 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6bbc7d56d-v8cvv"] Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.654542 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.674718 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bbc7d56d-v8cvv"] Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.692259 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-58hgk" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.706690 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-config\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.706746 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f49ae0c4-aa16-448b-859d-45ed8809ac9d-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-b7fh2\" (UID: \"f49ae0c4-aa16-448b-859d-45ed8809ac9d\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b7fh2" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.706783 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f49ae0c4-aa16-448b-859d-45ed8809ac9d-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-b7fh2\" (UID: \"f49ae0c4-aa16-448b-859d-45ed8809ac9d\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b7fh2" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.706804 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-service-ca\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.706840 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-oauth-serving-cert\") pod \"console-6bbc7d56d-v8cvv\" (UID: 
\"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.706876 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-oauth-config\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.706906 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6f6v\" (UniqueName: \"kubernetes.io/projected/f49ae0c4-aa16-448b-859d-45ed8809ac9d-kube-api-access-p6f6v\") pod \"nmstate-console-plugin-6ff7998486-b7fh2\" (UID: \"f49ae0c4-aa16-448b-859d-45ed8809ac9d\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b7fh2" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.706940 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-trusted-ca-bundle\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.706964 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-serving-cert\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.706983 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n2d8\" (UniqueName: \"kubernetes.io/projected/8ea2f4af-f899-4832-98f1-e56e7665d2ba-kube-api-access-8n2d8\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.708010 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f49ae0c4-aa16-448b-859d-45ed8809ac9d-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-b7fh2\" (UID: \"f49ae0c4-aa16-448b-859d-45ed8809ac9d\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b7fh2" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.719791 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f49ae0c4-aa16-448b-859d-45ed8809ac9d-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-b7fh2\" (UID: \"f49ae0c4-aa16-448b-859d-45ed8809ac9d\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b7fh2" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.734625 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6f6v\" (UniqueName: \"kubernetes.io/projected/f49ae0c4-aa16-448b-859d-45ed8809ac9d-kube-api-access-p6f6v\") pod \"nmstate-console-plugin-6ff7998486-b7fh2\" (UID: \"f49ae0c4-aa16-448b-859d-45ed8809ac9d\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b7fh2" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.779389 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b7fh2" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.809666 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-config\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.809737 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-service-ca\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.809774 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-oauth-serving-cert\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.809814 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-oauth-config\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.809853 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-trusted-ca-bundle\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.810179 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-serving-cert\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.810247 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n2d8\" (UniqueName: \"kubernetes.io/projected/8ea2f4af-f899-4832-98f1-e56e7665d2ba-kube-api-access-8n2d8\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.811135 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-oauth-serving-cert\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.811492 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-config\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " 
pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.811633 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-service-ca\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.811803 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-trusted-ca-bundle\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.815658 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-oauth-config\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.816496 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-serving-cert\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.826190 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n2d8\" (UniqueName: \"kubernetes.io/projected/8ea2f4af-f899-4832-98f1-e56e7665d2ba-kube-api-access-8n2d8\") pod \"console-6bbc7d56d-v8cvv\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:06 crc kubenswrapper[4740]: I0105 14:02:06.992926 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:07 crc kubenswrapper[4740]: I0105 14:02:07.013925 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e6df01b0-f4c2-49c2-982d-4b814fd5d493-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-zvv7k\" (UID: \"e6df01b0-f4c2-49c2-982d-4b814fd5d493\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k" Jan 05 14:02:07 crc kubenswrapper[4740]: I0105 14:02:07.016870 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e6df01b0-f4c2-49c2-982d-4b814fd5d493-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-zvv7k\" (UID: \"e6df01b0-f4c2-49c2-982d-4b814fd5d493\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k" Jan 05 14:02:07 crc kubenswrapper[4740]: I0105 14:02:07.100649 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-qpc9m"] Jan 05 14:02:07 crc kubenswrapper[4740]: W0105 14:02:07.119378 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b892142_9e83_40e5_b305_28c1c3dde6d5.slice/crio-cc1dc0b81b0d16c69f02a8014872108501f40a690048007acc6c5a6dd79aaefa WatchSource:0}: Error finding container cc1dc0b81b0d16c69f02a8014872108501f40a690048007acc6c5a6dd79aaefa: Status 404 returned error can't find the container with id cc1dc0b81b0d16c69f02a8014872108501f40a690048007acc6c5a6dd79aaefa Jan 05 14:02:07 crc kubenswrapper[4740]: I0105 14:02:07.214439 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-b7fh2"] Jan 05 14:02:07 crc kubenswrapper[4740]: I0105 14:02:07.233079 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b7fh2" event={"ID":"f49ae0c4-aa16-448b-859d-45ed8809ac9d","Type":"ContainerStarted","Data":"3665407c86a418cb7e3c7d63099fc544f59c0bc9cc9b4bab5a5df455e026eb06"} Jan 05 14:02:07 crc kubenswrapper[4740]: I0105 14:02:07.234493 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-58hgk" event={"ID":"a4bfce1d-9af3-49f5-877e-b5ea29088ac7","Type":"ContainerStarted","Data":"aef69001ac5298092e8007cfe8882aed28825bbc3e7c50739e7f60d5514b930b"} Jan 05 14:02:07 crc kubenswrapper[4740]: I0105 14:02:07.235567 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-qpc9m" event={"ID":"7b892142-9e83-40e5-b305-28c1c3dde6d5","Type":"ContainerStarted","Data":"cc1dc0b81b0d16c69f02a8014872108501f40a690048007acc6c5a6dd79aaefa"} Jan 05 14:02:07 crc kubenswrapper[4740]: I0105 14:02:07.237014 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k" Jan 05 14:02:07 crc kubenswrapper[4740]: I0105 14:02:07.438731 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bbc7d56d-v8cvv"] Jan 05 14:02:07 crc kubenswrapper[4740]: I0105 14:02:07.502941 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k"] Jan 05 14:02:07 crc kubenswrapper[4740]: W0105 14:02:07.515412 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6df01b0_f4c2_49c2_982d_4b814fd5d493.slice/crio-5bee3ea7227a40c2a13fab23c3f58bca898c4af2529a97fc33a5e68c09b9ad46 WatchSource:0}: Error finding container 5bee3ea7227a40c2a13fab23c3f58bca898c4af2529a97fc33a5e68c09b9ad46: Status 404 returned error can't find the container with id 5bee3ea7227a40c2a13fab23c3f58bca898c4af2529a97fc33a5e68c09b9ad46 Jan 05 14:02:08 crc kubenswrapper[4740]: I0105 14:02:08.247642 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k" event={"ID":"e6df01b0-f4c2-49c2-982d-4b814fd5d493","Type":"ContainerStarted","Data":"5bee3ea7227a40c2a13fab23c3f58bca898c4af2529a97fc33a5e68c09b9ad46"} Jan 05 14:02:08 crc kubenswrapper[4740]: I0105 14:02:08.249773 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bbc7d56d-v8cvv" event={"ID":"8ea2f4af-f899-4832-98f1-e56e7665d2ba","Type":"ContainerStarted","Data":"8d4c6e477b93e27520954359820f3f9f67bc6f598d363d22ec23f63f17a55cc3"} Jan 05 14:02:08 crc kubenswrapper[4740]: I0105 14:02:08.249823 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bbc7d56d-v8cvv" event={"ID":"8ea2f4af-f899-4832-98f1-e56e7665d2ba","Type":"ContainerStarted","Data":"70536a2b66e03324823bf68de0d7a448842aa5a0c22133bae87b1574c6eb8a50"} Jan 05 14:02:08 crc kubenswrapper[4740]: I0105 14:02:08.291144 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6bbc7d56d-v8cvv" podStartSLOduration=2.291111557 podStartE2EDuration="2.291111557s" podCreationTimestamp="2026-01-05 14:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:02:08.276603452 +0000 UTC m=+777.583512081" watchObservedRunningTime="2026-01-05 14:02:08.291111557 +0000 UTC m=+777.598020166" Jan 05 14:02:12 crc kubenswrapper[4740]: I0105 14:02:12.289121 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k" event={"ID":"e6df01b0-f4c2-49c2-982d-4b814fd5d493","Type":"ContainerStarted","Data":"de6423454c8208d3a09595a026716c6445423f0eaafc6101f7cd4b1640b7866d"} Jan 05 14:02:12 crc kubenswrapper[4740]: I0105 14:02:12.290267 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k" Jan 05 14:02:12 crc kubenswrapper[4740]: I0105 14:02:12.291831 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-58hgk" event={"ID":"a4bfce1d-9af3-49f5-877e-b5ea29088ac7","Type":"ContainerStarted","Data":"1400a59a4dd9a6e24e5a6872b3c306d29c7cd45e7d664a7608beb186b57b8313"} Jan 05 14:02:12 crc kubenswrapper[4740]: I0105 14:02:12.292081 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-58hgk" Jan 05 14:02:12 crc kubenswrapper[4740]: I0105 
14:02:12.294858 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-qpc9m" event={"ID":"7b892142-9e83-40e5-b305-28c1c3dde6d5","Type":"ContainerStarted","Data":"eeed3dbc3eb225144b002ae1c7166881374eb73443f88294cb2659291770e326"} Jan 05 14:02:12 crc kubenswrapper[4740]: I0105 14:02:12.296863 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b7fh2" event={"ID":"f49ae0c4-aa16-448b-859d-45ed8809ac9d","Type":"ContainerStarted","Data":"c1ba3103e4c75e15d27ce8713a78d33b0bd5dc56838e81561a343d6b4fde8172"} Jan 05 14:02:12 crc kubenswrapper[4740]: I0105 14:02:12.318152 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k" podStartSLOduration=2.280557217 podStartE2EDuration="6.318129819s" podCreationTimestamp="2026-01-05 14:02:06 +0000 UTC" firstStartedPulling="2026-01-05 14:02:07.518717335 +0000 UTC m=+776.825625914" lastFinishedPulling="2026-01-05 14:02:11.556289937 +0000 UTC m=+780.863198516" observedRunningTime="2026-01-05 14:02:12.314715648 +0000 UTC m=+781.621624247" watchObservedRunningTime="2026-01-05 14:02:12.318129819 +0000 UTC m=+781.625038428" Jan 05 14:02:12 crc kubenswrapper[4740]: I0105 14:02:12.346991 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-58hgk" podStartSLOduration=1.518949792 podStartE2EDuration="6.346970034s" podCreationTimestamp="2026-01-05 14:02:06 +0000 UTC" firstStartedPulling="2026-01-05 14:02:06.739805811 +0000 UTC m=+776.046714380" lastFinishedPulling="2026-01-05 14:02:11.567826003 +0000 UTC m=+780.874734622" observedRunningTime="2026-01-05 14:02:12.333486156 +0000 UTC m=+781.640394735" watchObservedRunningTime="2026-01-05 14:02:12.346970034 +0000 UTC m=+781.653878623" Jan 05 14:02:12 crc kubenswrapper[4740]: I0105 14:02:12.349155 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b7fh2" podStartSLOduration=2.006708279 podStartE2EDuration="6.349145521s" podCreationTimestamp="2026-01-05 14:02:06 +0000 UTC" firstStartedPulling="2026-01-05 14:02:07.212881299 +0000 UTC m=+776.519789878" lastFinishedPulling="2026-01-05 14:02:11.555318501 +0000 UTC m=+780.862227120" observedRunningTime="2026-01-05 14:02:12.348323399 +0000 UTC m=+781.655231988" watchObservedRunningTime="2026-01-05 14:02:12.349145521 +0000 UTC m=+781.656054110" Jan 05 14:02:14 crc kubenswrapper[4740]: I0105 14:02:14.568295 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tpvh6" Jan 05 14:02:14 crc kubenswrapper[4740]: I0105 14:02:14.655912 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tpvh6" Jan 05 14:02:14 crc kubenswrapper[4740]: I0105 14:02:14.808903 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tpvh6"] Jan 05 14:02:16 crc kubenswrapper[4740]: I0105 14:02:16.350001 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-qpc9m" event={"ID":"7b892142-9e83-40e5-b305-28c1c3dde6d5","Type":"ContainerStarted","Data":"7e19ea66284581bc0d35fbc04b841845f4ed32b45d580c64bed4c01d0a78cd43"} Jan 05 14:02:16 crc kubenswrapper[4740]: I0105 14:02:16.350501 4740 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-tpvh6" podUID="8a9adf60-c879-46ae-b05d-e5d9d93bc1e1" containerName="registry-server" containerID="cri-o://2b4617179a1a9eeda3d720ff17c3839ed22895ff8b1158ff1e1446b90e802c6a" gracePeriod=2 Jan 05 14:02:16 crc kubenswrapper[4740]: I0105 14:02:16.742676 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-58hgk" Jan 05 14:02:16 crc kubenswrapper[4740]: I0105 14:02:16.775738 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-qpc9m" podStartSLOduration=2.641073333 podStartE2EDuration="10.775720774s" podCreationTimestamp="2026-01-05 14:02:06 +0000 UTC" firstStartedPulling="2026-01-05 14:02:07.130103825 +0000 UTC m=+776.437012404" lastFinishedPulling="2026-01-05 14:02:15.264751276 +0000 UTC m=+784.571659845" observedRunningTime="2026-01-05 14:02:16.399523012 +0000 UTC m=+785.706431661" watchObservedRunningTime="2026-01-05 14:02:16.775720774 +0000 UTC m=+786.082629363" Jan 05 14:02:16 crc kubenswrapper[4740]: I0105 14:02:16.879660 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tpvh6" Jan 05 14:02:16 crc kubenswrapper[4740]: I0105 14:02:16.993322 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:16 crc kubenswrapper[4740]: I0105 14:02:16.993377 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.002186 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.012815 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxwmw\" (UniqueName: \"kubernetes.io/projected/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-kube-api-access-hxwmw\") pod \"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1\" (UID: \"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1\") " Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.012927 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-utilities\") pod \"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1\" (UID: \"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1\") " Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.013117 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-catalog-content\") pod \"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1\" (UID: \"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1\") " Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.014740 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-utilities" (OuterVolumeSpecName: "utilities") pod "8a9adf60-c879-46ae-b05d-e5d9d93bc1e1" (UID: "8a9adf60-c879-46ae-b05d-e5d9d93bc1e1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.021629 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-kube-api-access-hxwmw" (OuterVolumeSpecName: "kube-api-access-hxwmw") pod "8a9adf60-c879-46ae-b05d-e5d9d93bc1e1" (UID: "8a9adf60-c879-46ae-b05d-e5d9d93bc1e1"). InnerVolumeSpecName "kube-api-access-hxwmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.116471 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxwmw\" (UniqueName: \"kubernetes.io/projected/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-kube-api-access-hxwmw\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.116520 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.173531 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a9adf60-c879-46ae-b05d-e5d9d93bc1e1" (UID: "8a9adf60-c879-46ae-b05d-e5d9d93bc1e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.217773 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.363019 4740 generic.go:334] "Generic (PLEG): container finished" podID="8a9adf60-c879-46ae-b05d-e5d9d93bc1e1" containerID="2b4617179a1a9eeda3d720ff17c3839ed22895ff8b1158ff1e1446b90e802c6a" exitCode=0 Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.363092 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tpvh6" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.363230 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpvh6" event={"ID":"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1","Type":"ContainerDied","Data":"2b4617179a1a9eeda3d720ff17c3839ed22895ff8b1158ff1e1446b90e802c6a"} Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.363310 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpvh6" event={"ID":"8a9adf60-c879-46ae-b05d-e5d9d93bc1e1","Type":"ContainerDied","Data":"aaeadaa3774a767adedcb1950e3b5f8d7c59a9eed14cc49caaf2f61457943baf"} Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.363344 4740 scope.go:117] "RemoveContainer" containerID="2b4617179a1a9eeda3d720ff17c3839ed22895ff8b1158ff1e1446b90e802c6a" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.369640 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.404476 4740 scope.go:117] "RemoveContainer" containerID="3301d39a25fa4c4017afb333e3c453f91a926d1b6fba97c067fffaa2cdf95047" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.473975 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tpvh6"] Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.479509 4740 scope.go:117] "RemoveContainer" containerID="870fe20c480f050eb13dd8ea0db313a35862cb057ba57916ad86da36b852b29a" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.487227 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tpvh6"] Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.511908 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-75f75bdd6-kzlkk"] Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.523285 4740 scope.go:117] "RemoveContainer" containerID="2b4617179a1a9eeda3d720ff17c3839ed22895ff8b1158ff1e1446b90e802c6a" Jan 05 14:02:17 crc kubenswrapper[4740]: E0105 14:02:17.528925 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b4617179a1a9eeda3d720ff17c3839ed22895ff8b1158ff1e1446b90e802c6a\": container with ID starting with 2b4617179a1a9eeda3d720ff17c3839ed22895ff8b1158ff1e1446b90e802c6a not found: ID does not exist" containerID="2b4617179a1a9eeda3d720ff17c3839ed22895ff8b1158ff1e1446b90e802c6a" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.529248 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b4617179a1a9eeda3d720ff17c3839ed22895ff8b1158ff1e1446b90e802c6a"} err="failed to get container status \"2b4617179a1a9eeda3d720ff17c3839ed22895ff8b1158ff1e1446b90e802c6a\": rpc error: code = NotFound desc = could not find container \"2b4617179a1a9eeda3d720ff17c3839ed22895ff8b1158ff1e1446b90e802c6a\": container with ID starting with 2b4617179a1a9eeda3d720ff17c3839ed22895ff8b1158ff1e1446b90e802c6a not found: ID does not exist" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.529275 4740 scope.go:117] "RemoveContainer" containerID="3301d39a25fa4c4017afb333e3c453f91a926d1b6fba97c067fffaa2cdf95047" Jan 05 14:02:17 crc kubenswrapper[4740]: E0105 14:02:17.531602 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3301d39a25fa4c4017afb333e3c453f91a926d1b6fba97c067fffaa2cdf95047\": container with ID starting with 3301d39a25fa4c4017afb333e3c453f91a926d1b6fba97c067fffaa2cdf95047 not found: ID does not exist" containerID="3301d39a25fa4c4017afb333e3c453f91a926d1b6fba97c067fffaa2cdf95047" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.531634 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3301d39a25fa4c4017afb333e3c453f91a926d1b6fba97c067fffaa2cdf95047"} err="failed to get container status \"3301d39a25fa4c4017afb333e3c453f91a926d1b6fba97c067fffaa2cdf95047\": rpc error: code = NotFound desc = could not find container \"3301d39a25fa4c4017afb333e3c453f91a926d1b6fba97c067fffaa2cdf95047\": container with ID starting with 3301d39a25fa4c4017afb333e3c453f91a926d1b6fba97c067fffaa2cdf95047 not found: ID does not exist" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.531652 4740 scope.go:117] "RemoveContainer" containerID="870fe20c480f050eb13dd8ea0db313a35862cb057ba57916ad86da36b852b29a" Jan 05 14:02:17 crc kubenswrapper[4740]: E0105 14:02:17.531982 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"870fe20c480f050eb13dd8ea0db313a35862cb057ba57916ad86da36b852b29a\": container with ID starting with 870fe20c480f050eb13dd8ea0db313a35862cb057ba57916ad86da36b852b29a not found: ID does not exist" containerID="870fe20c480f050eb13dd8ea0db313a35862cb057ba57916ad86da36b852b29a" Jan 05 14:02:17 crc kubenswrapper[4740]: I0105 14:02:17.532005 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870fe20c480f050eb13dd8ea0db313a35862cb057ba57916ad86da36b852b29a"} err="failed to get container status \"870fe20c480f050eb13dd8ea0db313a35862cb057ba57916ad86da36b852b29a\": rpc error: code = NotFound desc = could not find container \"870fe20c480f050eb13dd8ea0db313a35862cb057ba57916ad86da36b852b29a\": container with ID starting with 870fe20c480f050eb13dd8ea0db313a35862cb057ba57916ad86da36b852b29a not found: ID does not exist" Jan 05 14:02:18 crc kubenswrapper[4740]: I0105 14:02:18.985698 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a9adf60-c879-46ae-b05d-e5d9d93bc1e1" path="/var/lib/kubelet/pods/8a9adf60-c879-46ae-b05d-e5d9d93bc1e1/volumes" Jan 05 14:02:27 crc kubenswrapper[4740]: I0105 14:02:27.246782 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k" Jan 05 14:02:31 crc kubenswrapper[4740]: I0105 14:02:31.916747 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:02:31 crc kubenswrapper[4740]: I0105 14:02:31.917637 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:02:34 crc kubenswrapper[4740]: I0105 14:02:34.996816 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fgbrz"] Jan 05 14:02:34 crc kubenswrapper[4740]: E0105 14:02:34.997999 4740 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8a9adf60-c879-46ae-b05d-e5d9d93bc1e1" containerName="registry-server" Jan 05 14:02:34 crc kubenswrapper[4740]: I0105 14:02:34.998045 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9adf60-c879-46ae-b05d-e5d9d93bc1e1" containerName="registry-server" Jan 05 14:02:34 crc kubenswrapper[4740]: E0105 14:02:34.998108 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9adf60-c879-46ae-b05d-e5d9d93bc1e1" containerName="extract-utilities" Jan 05 14:02:34 crc kubenswrapper[4740]: I0105 14:02:34.998123 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9adf60-c879-46ae-b05d-e5d9d93bc1e1" containerName="extract-utilities" Jan 05 14:02:34 crc kubenswrapper[4740]: E0105 14:02:34.998160 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9adf60-c879-46ae-b05d-e5d9d93bc1e1" containerName="extract-content" Jan 05 14:02:34 crc kubenswrapper[4740]: I0105 14:02:34.998175 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9adf60-c879-46ae-b05d-e5d9d93bc1e1" containerName="extract-content" Jan 05 14:02:34 crc kubenswrapper[4740]: I0105 14:02:34.998460 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a9adf60-c879-46ae-b05d-e5d9d93bc1e1" containerName="registry-server" Jan 05 14:02:35 crc kubenswrapper[4740]: I0105 14:02:35.000541 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fgbrz" Jan 05 14:02:35 crc kubenswrapper[4740]: I0105 14:02:35.028996 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fgbrz"] Jan 05 14:02:35 crc kubenswrapper[4740]: I0105 14:02:35.109214 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qg8w\" (UniqueName: \"kubernetes.io/projected/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-kube-api-access-7qg8w\") pod \"community-operators-fgbrz\" (UID: \"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66\") " pod="openshift-marketplace/community-operators-fgbrz" Jan 05 14:02:35 crc kubenswrapper[4740]: I0105 14:02:35.109292 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-utilities\") pod \"community-operators-fgbrz\" (UID: \"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66\") " pod="openshift-marketplace/community-operators-fgbrz" Jan 05 14:02:35 crc kubenswrapper[4740]: I0105 14:02:35.110596 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-catalog-content\") pod \"community-operators-fgbrz\" (UID: \"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66\") " pod="openshift-marketplace/community-operators-fgbrz" Jan 05 14:02:35 crc kubenswrapper[4740]: I0105 14:02:35.212183 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qg8w\" (UniqueName: \"kubernetes.io/projected/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-kube-api-access-7qg8w\") pod \"community-operators-fgbrz\" (UID: \"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66\") " pod="openshift-marketplace/community-operators-fgbrz" Jan 05 14:02:35 crc kubenswrapper[4740]: I0105 14:02:35.212239 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-utilities\") pod \"community-operators-fgbrz\" (UID: \"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66\") " pod="openshift-marketplace/community-operators-fgbrz" Jan 05 14:02:35 crc kubenswrapper[4740]: I0105 14:02:35.212321 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-catalog-content\") pod \"community-operators-fgbrz\" (UID: \"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66\") " pod="openshift-marketplace/community-operators-fgbrz" Jan 05 14:02:35 crc kubenswrapper[4740]: I0105 14:02:35.212819 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-catalog-content\") pod \"community-operators-fgbrz\" (UID: \"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66\") " pod="openshift-marketplace/community-operators-fgbrz" Jan 05 14:02:35 crc kubenswrapper[4740]: I0105 14:02:35.212828 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-utilities\") pod \"community-operators-fgbrz\" (UID: \"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66\") " pod="openshift-marketplace/community-operators-fgbrz" Jan 05 14:02:35 crc kubenswrapper[4740]: I0105 14:02:35.242276 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qg8w\" (UniqueName: \"kubernetes.io/projected/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-kube-api-access-7qg8w\") pod \"community-operators-fgbrz\" (UID: \"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66\") " pod="openshift-marketplace/community-operators-fgbrz" Jan 05 14:02:35 crc kubenswrapper[4740]: I0105 14:02:35.320805 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fgbrz" Jan 05 14:02:35 crc kubenswrapper[4740]: I0105 14:02:35.637051 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fgbrz"] Jan 05 14:02:36 crc kubenswrapper[4740]: I0105 14:02:36.651123 4740 generic.go:334] "Generic (PLEG): container finished" podID="fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66" containerID="013c6ee4d42e5d1788e6ddf9983cc46b59a6ae491ddb9af367a22a54fa4abe26" exitCode=0 Jan 05 14:02:36 crc kubenswrapper[4740]: I0105 14:02:36.651222 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgbrz" event={"ID":"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66","Type":"ContainerDied","Data":"013c6ee4d42e5d1788e6ddf9983cc46b59a6ae491ddb9af367a22a54fa4abe26"} Jan 05 14:02:36 crc kubenswrapper[4740]: I0105 14:02:36.651523 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgbrz" event={"ID":"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66","Type":"ContainerStarted","Data":"e27b9ba30a927676812bb0e6be4796f2f1a7412ff4aab8a3c838f56253ccea24"} Jan 05 14:02:36 crc kubenswrapper[4740]: I0105 14:02:36.654006 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 05 14:02:38 crc kubenswrapper[4740]: I0105 14:02:38.684016 4740 generic.go:334] "Generic (PLEG): container finished" podID="fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66" containerID="6e98dfc33dff94ce624b5d61b29353316134d716073a2147f24882c2d692514b" exitCode=0 Jan 05 14:02:38 crc kubenswrapper[4740]: I0105 14:02:38.684480 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgbrz" event={"ID":"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66","Type":"ContainerDied","Data":"6e98dfc33dff94ce624b5d61b29353316134d716073a2147f24882c2d692514b"} Jan 05 14:02:39 crc kubenswrapper[4740]: I0105 14:02:39.695706 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgbrz" event={"ID":"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66","Type":"ContainerStarted","Data":"bd6608b91395c93e5f73dcebd7fbb370577c0cfc9eee1162d1a6b07bb256ea07"} Jan 05 14:02:39 crc kubenswrapper[4740]: I0105 14:02:39.719502 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fgbrz" podStartSLOduration=3.012346117 podStartE2EDuration="5.719486878s" podCreationTimestamp="2026-01-05 14:02:34 +0000 UTC" firstStartedPulling="2026-01-05 14:02:36.653788464 +0000 UTC m=+805.960697033" lastFinishedPulling="2026-01-05 14:02:39.360929175 +0000 UTC m=+808.667837794" observedRunningTime="2026-01-05 14:02:39.717188937 +0000 UTC m=+809.024097516" watchObservedRunningTime="2026-01-05 14:02:39.719486878 +0000 UTC m=+809.026395457" Jan 05 14:02:42 crc kubenswrapper[4740]: I0105 14:02:42.587382 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-75f75bdd6-kzlkk" podUID="8e9a546c-882a-4c69-b739-d682512dd39e" containerName="console" containerID="cri-o://df1063e86f8d755dfa3e8556f20d16100e784c7c0c12876cfe4c1c7e48d828eb" gracePeriod=15 Jan 05 14:02:42 crc kubenswrapper[4740]: I0105 14:02:42.731197 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75f75bdd6-kzlkk_8e9a546c-882a-4c69-b739-d682512dd39e/console/0.log" Jan 05 14:02:42 crc kubenswrapper[4740]: I0105 14:02:42.731576 4740 generic.go:334] "Generic (PLEG): container finished" 
podID="8e9a546c-882a-4c69-b739-d682512dd39e" containerID="df1063e86f8d755dfa3e8556f20d16100e784c7c0c12876cfe4c1c7e48d828eb" exitCode=2 Jan 05 14:02:42 crc kubenswrapper[4740]: I0105 14:02:42.731633 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75f75bdd6-kzlkk" event={"ID":"8e9a546c-882a-4c69-b739-d682512dd39e","Type":"ContainerDied","Data":"df1063e86f8d755dfa3e8556f20d16100e784c7c0c12876cfe4c1c7e48d828eb"} Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.048477 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75f75bdd6-kzlkk_8e9a546c-882a-4c69-b739-d682512dd39e/console/0.log" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.048857 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.178520 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-console-config\") pod \"8e9a546c-882a-4c69-b739-d682512dd39e\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.178880 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e9a546c-882a-4c69-b739-d682512dd39e-console-oauth-config\") pod \"8e9a546c-882a-4c69-b739-d682512dd39e\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.178943 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e9a546c-882a-4c69-b739-d682512dd39e-console-serving-cert\") pod \"8e9a546c-882a-4c69-b739-d682512dd39e\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.178965 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-service-ca\") pod \"8e9a546c-882a-4c69-b739-d682512dd39e\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.179043 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-oauth-serving-cert\") pod \"8e9a546c-882a-4c69-b739-d682512dd39e\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.179087 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-trusted-ca-bundle\") pod \"8e9a546c-882a-4c69-b739-d682512dd39e\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.179119 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjnvs\" (UniqueName: \"kubernetes.io/projected/8e9a546c-882a-4c69-b739-d682512dd39e-kube-api-access-hjnvs\") pod \"8e9a546c-882a-4c69-b739-d682512dd39e\" (UID: \"8e9a546c-882a-4c69-b739-d682512dd39e\") " Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.179810 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-console-config" (OuterVolumeSpecName: "console-config") pod "8e9a546c-882a-4c69-b739-d682512dd39e" (UID: "8e9a546c-882a-4c69-b739-d682512dd39e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.179895 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8e9a546c-882a-4c69-b739-d682512dd39e" (UID: "8e9a546c-882a-4c69-b739-d682512dd39e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.179832 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8e9a546c-882a-4c69-b739-d682512dd39e" (UID: "8e9a546c-882a-4c69-b739-d682512dd39e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.180225 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-service-ca" (OuterVolumeSpecName: "service-ca") pod "8e9a546c-882a-4c69-b739-d682512dd39e" (UID: "8e9a546c-882a-4c69-b739-d682512dd39e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.184664 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e9a546c-882a-4c69-b739-d682512dd39e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8e9a546c-882a-4c69-b739-d682512dd39e" (UID: "8e9a546c-882a-4c69-b739-d682512dd39e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.185286 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e9a546c-882a-4c69-b739-d682512dd39e-kube-api-access-hjnvs" (OuterVolumeSpecName: "kube-api-access-hjnvs") pod "8e9a546c-882a-4c69-b739-d682512dd39e" (UID: "8e9a546c-882a-4c69-b739-d682512dd39e"). InnerVolumeSpecName "kube-api-access-hjnvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.185542 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e9a546c-882a-4c69-b739-d682512dd39e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8e9a546c-882a-4c69-b739-d682512dd39e" (UID: "8e9a546c-882a-4c69-b739-d682512dd39e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.281244 4740 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-console-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.281282 4740 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e9a546c-882a-4c69-b739-d682512dd39e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.281294 4740 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e9a546c-882a-4c69-b739-d682512dd39e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.281309 4740 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-service-ca\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.281323 4740 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.281334 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e9a546c-882a-4c69-b739-d682512dd39e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.281383 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjnvs\" (UniqueName: \"kubernetes.io/projected/8e9a546c-882a-4c69-b739-d682512dd39e-kube-api-access-hjnvs\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.740734 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75f75bdd6-kzlkk_8e9a546c-882a-4c69-b739-d682512dd39e/console/0.log" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.741434 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75f75bdd6-kzlkk" event={"ID":"8e9a546c-882a-4c69-b739-d682512dd39e","Type":"ContainerDied","Data":"635cbce9ed95f147e57a14482c77e3215fe350644e187f919690321ccdfeb1ac"} Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.741473 4740 scope.go:117] "RemoveContainer" containerID="df1063e86f8d755dfa3e8556f20d16100e784c7c0c12876cfe4c1c7e48d828eb" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.741539 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75f75bdd6-kzlkk" Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.792463 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-75f75bdd6-kzlkk"] Jan 05 14:02:43 crc kubenswrapper[4740]: I0105 14:02:43.797662 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-75f75bdd6-kzlkk"] Jan 05 14:02:44 crc kubenswrapper[4740]: I0105 14:02:44.558171 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5qd7x"] Jan 05 14:02:44 crc kubenswrapper[4740]: E0105 14:02:44.558929 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e9a546c-882a-4c69-b739-d682512dd39e" containerName="console" Jan 05 14:02:44 crc kubenswrapper[4740]: I0105 14:02:44.558957 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e9a546c-882a-4c69-b739-d682512dd39e" containerName="console" Jan 05 14:02:44 crc kubenswrapper[4740]: I0105 14:02:44.559278 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e9a546c-882a-4c69-b739-d682512dd39e" containerName="console" Jan 05 14:02:44 crc kubenswrapper[4740]: I0105 14:02:44.561202 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5qd7x" Jan 05 14:02:44 crc kubenswrapper[4740]: I0105 14:02:44.568101 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5qd7x"] Jan 05 14:02:44 crc kubenswrapper[4740]: I0105 14:02:44.608208 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-utilities\") pod \"certified-operators-5qd7x\" (UID: \"2a5c77d7-201f-4d04-9b26-b15c3e1905ab\") " pod="openshift-marketplace/certified-operators-5qd7x" Jan 05 14:02:44 crc kubenswrapper[4740]: I0105 14:02:44.608281 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-catalog-content\") pod \"certified-operators-5qd7x\" (UID: \"2a5c77d7-201f-4d04-9b26-b15c3e1905ab\") " pod="openshift-marketplace/certified-operators-5qd7x" Jan 05 14:02:44 crc kubenswrapper[4740]: I0105 14:02:44.608374 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-872wj\" (UniqueName: \"kubernetes.io/projected/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-kube-api-access-872wj\") pod \"certified-operators-5qd7x\" (UID: \"2a5c77d7-201f-4d04-9b26-b15c3e1905ab\") " pod="openshift-marketplace/certified-operators-5qd7x" Jan 05 14:02:44 crc kubenswrapper[4740]: I0105 14:02:44.710136 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-872wj\" (UniqueName: \"kubernetes.io/projected/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-kube-api-access-872wj\") pod \"certified-operators-5qd7x\" (UID: \"2a5c77d7-201f-4d04-9b26-b15c3e1905ab\") " pod="openshift-marketplace/certified-operators-5qd7x" Jan 05 14:02:44 crc kubenswrapper[4740]: I0105 14:02:44.710273 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-utilities\") pod \"certified-operators-5qd7x\" (UID: \"2a5c77d7-201f-4d04-9b26-b15c3e1905ab\") " 
pod="openshift-marketplace/certified-operators-5qd7x" Jan 05 14:02:44 crc kubenswrapper[4740]: I0105 14:02:44.710297 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-catalog-content\") pod \"certified-operators-5qd7x\" (UID: \"2a5c77d7-201f-4d04-9b26-b15c3e1905ab\") " pod="openshift-marketplace/certified-operators-5qd7x" Jan 05 14:02:44 crc kubenswrapper[4740]: I0105 14:02:44.710987 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-utilities\") pod \"certified-operators-5qd7x\" (UID: \"2a5c77d7-201f-4d04-9b26-b15c3e1905ab\") " pod="openshift-marketplace/certified-operators-5qd7x" Jan 05 14:02:44 crc kubenswrapper[4740]: I0105 14:02:44.711020 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-catalog-content\") pod \"certified-operators-5qd7x\" (UID: \"2a5c77d7-201f-4d04-9b26-b15c3e1905ab\") " pod="openshift-marketplace/certified-operators-5qd7x" Jan 05 14:02:44 crc kubenswrapper[4740]: I0105 14:02:44.731395 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-872wj\" (UniqueName: \"kubernetes.io/projected/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-kube-api-access-872wj\") pod \"certified-operators-5qd7x\" (UID: \"2a5c77d7-201f-4d04-9b26-b15c3e1905ab\") " pod="openshift-marketplace/certified-operators-5qd7x" Jan 05 14:02:44 crc kubenswrapper[4740]: I0105 14:02:44.889241 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5qd7x" Jan 05 14:02:44 crc kubenswrapper[4740]: I0105 14:02:44.980805 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e9a546c-882a-4c69-b739-d682512dd39e" path="/var/lib/kubelet/pods/8e9a546c-882a-4c69-b739-d682512dd39e/volumes" Jan 05 14:02:45 crc kubenswrapper[4740]: I0105 14:02:45.321759 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fgbrz" Jan 05 14:02:45 crc kubenswrapper[4740]: I0105 14:02:45.322077 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fgbrz" Jan 05 14:02:45 crc kubenswrapper[4740]: I0105 14:02:45.366708 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fgbrz" Jan 05 14:02:45 crc kubenswrapper[4740]: I0105 14:02:45.380168 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5qd7x"] Jan 05 14:02:45 crc kubenswrapper[4740]: I0105 14:02:45.757542 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qd7x" event={"ID":"2a5c77d7-201f-4d04-9b26-b15c3e1905ab","Type":"ContainerStarted","Data":"3737f25d095c2c295228484f69ae24136a8b3cfcc7f5073b5b4c3fd816061e07"} Jan 05 14:02:45 crc kubenswrapper[4740]: I0105 14:02:45.834487 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fgbrz" Jan 05 14:02:46 crc kubenswrapper[4740]: I0105 14:02:46.767158 4740 generic.go:334] "Generic (PLEG): container finished" podID="2a5c77d7-201f-4d04-9b26-b15c3e1905ab" 
containerID="6362d25d804a317d99a714d9781c5166d8cd8944d1899dae1bee89a13324ff77" exitCode=0 Jan 05 14:02:46 crc kubenswrapper[4740]: I0105 14:02:46.767241 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qd7x" event={"ID":"2a5c77d7-201f-4d04-9b26-b15c3e1905ab","Type":"ContainerDied","Data":"6362d25d804a317d99a714d9781c5166d8cd8944d1899dae1bee89a13324ff77"} Jan 05 14:02:47 crc kubenswrapper[4740]: I0105 14:02:47.728701 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fgbrz"] Jan 05 14:02:48 crc kubenswrapper[4740]: I0105 14:02:48.782322 4740 generic.go:334] "Generic (PLEG): container finished" podID="2a5c77d7-201f-4d04-9b26-b15c3e1905ab" containerID="7bee8f00a8334b9c297bea1cd38ad8a8d41fc957e91489e523019a420fc006eb" exitCode=0 Jan 05 14:02:48 crc kubenswrapper[4740]: I0105 14:02:48.782367 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qd7x" event={"ID":"2a5c77d7-201f-4d04-9b26-b15c3e1905ab","Type":"ContainerDied","Data":"7bee8f00a8334b9c297bea1cd38ad8a8d41fc957e91489e523019a420fc006eb"} Jan 05 14:02:48 crc kubenswrapper[4740]: I0105 14:02:48.782881 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fgbrz" podUID="fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66" containerName="registry-server" containerID="cri-o://bd6608b91395c93e5f73dcebd7fbb370577c0cfc9eee1162d1a6b07bb256ea07" gracePeriod=2 Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.202927 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fgbrz" Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.304836 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qg8w\" (UniqueName: \"kubernetes.io/projected/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-kube-api-access-7qg8w\") pod \"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66\" (UID: \"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66\") " Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.305184 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-catalog-content\") pod \"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66\" (UID: \"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66\") " Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.305275 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-utilities\") pod \"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66\" (UID: \"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66\") " Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.305964 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-utilities" (OuterVolumeSpecName: "utilities") pod "fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66" (UID: "fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.314365 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-kube-api-access-7qg8w" (OuterVolumeSpecName: "kube-api-access-7qg8w") pod "fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66" (UID: "fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66"). InnerVolumeSpecName "kube-api-access-7qg8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.348278 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66" (UID: "fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.406659 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qg8w\" (UniqueName: \"kubernetes.io/projected/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-kube-api-access-7qg8w\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.406686 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.406695 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.791321 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qd7x" event={"ID":"2a5c77d7-201f-4d04-9b26-b15c3e1905ab","Type":"ContainerStarted","Data":"9e8dd981717da002db76d5fe4924fcc8841ba9a293e108865d12f78e1ce1b8b9"} Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.793386 4740 generic.go:334] "Generic (PLEG): container finished" podID="fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66" containerID="bd6608b91395c93e5f73dcebd7fbb370577c0cfc9eee1162d1a6b07bb256ea07" exitCode=0 Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.793425 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgbrz" event={"ID":"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66","Type":"ContainerDied","Data":"bd6608b91395c93e5f73dcebd7fbb370577c0cfc9eee1162d1a6b07bb256ea07"} Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.793439 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fgbrz" Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.793449 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgbrz" event={"ID":"fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66","Type":"ContainerDied","Data":"e27b9ba30a927676812bb0e6be4796f2f1a7412ff4aab8a3c838f56253ccea24"} Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.793467 4740 scope.go:117] "RemoveContainer" containerID="bd6608b91395c93e5f73dcebd7fbb370577c0cfc9eee1162d1a6b07bb256ea07" Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.807574 4740 scope.go:117] "RemoveContainer" containerID="6e98dfc33dff94ce624b5d61b29353316134d716073a2147f24882c2d692514b" Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.820464 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5qd7x" podStartSLOduration=3.126618936 podStartE2EDuration="5.820448974s" podCreationTimestamp="2026-01-05 14:02:44 +0000 UTC" firstStartedPulling="2026-01-05 14:02:46.771647759 +0000 UTC m=+816.078556378" lastFinishedPulling="2026-01-05 14:02:49.465477837 +0000 UTC m=+818.772386416" observedRunningTime="2026-01-05 14:02:49.818620526 +0000 UTC m=+819.125529105" watchObservedRunningTime="2026-01-05 14:02:49.820448974 +0000 UTC m=+819.127357553" Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.834730 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fgbrz"] Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.837623 4740 scope.go:117] "RemoveContainer" containerID="013c6ee4d42e5d1788e6ddf9983cc46b59a6ae491ddb9af367a22a54fa4abe26" Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.840611 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fgbrz"] Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.851627 4740 scope.go:117] "RemoveContainer" containerID="bd6608b91395c93e5f73dcebd7fbb370577c0cfc9eee1162d1a6b07bb256ea07" Jan 05 14:02:49 crc kubenswrapper[4740]: E0105 14:02:49.852009 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd6608b91395c93e5f73dcebd7fbb370577c0cfc9eee1162d1a6b07bb256ea07\": container with ID starting with bd6608b91395c93e5f73dcebd7fbb370577c0cfc9eee1162d1a6b07bb256ea07 not found: ID does not exist" containerID="bd6608b91395c93e5f73dcebd7fbb370577c0cfc9eee1162d1a6b07bb256ea07" Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.852047 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd6608b91395c93e5f73dcebd7fbb370577c0cfc9eee1162d1a6b07bb256ea07"} err="failed to get container status \"bd6608b91395c93e5f73dcebd7fbb370577c0cfc9eee1162d1a6b07bb256ea07\": rpc error: code = NotFound desc = could not find container \"bd6608b91395c93e5f73dcebd7fbb370577c0cfc9eee1162d1a6b07bb256ea07\": container with ID starting with bd6608b91395c93e5f73dcebd7fbb370577c0cfc9eee1162d1a6b07bb256ea07 not found: ID does not exist" Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.852086 4740 scope.go:117] "RemoveContainer" containerID="6e98dfc33dff94ce624b5d61b29353316134d716073a2147f24882c2d692514b" Jan 05 14:02:49 crc kubenswrapper[4740]: E0105 14:02:49.852350 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6e98dfc33dff94ce624b5d61b29353316134d716073a2147f24882c2d692514b\": container with ID starting with 6e98dfc33dff94ce624b5d61b29353316134d716073a2147f24882c2d692514b not found: ID does not exist" containerID="6e98dfc33dff94ce624b5d61b29353316134d716073a2147f24882c2d692514b" Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.852383 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e98dfc33dff94ce624b5d61b29353316134d716073a2147f24882c2d692514b"} err="failed to get container status \"6e98dfc33dff94ce624b5d61b29353316134d716073a2147f24882c2d692514b\": rpc error: code = NotFound desc = could not find container \"6e98dfc33dff94ce624b5d61b29353316134d716073a2147f24882c2d692514b\": container with ID starting with 6e98dfc33dff94ce624b5d61b29353316134d716073a2147f24882c2d692514b not found: ID does not exist" Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.852405 4740 scope.go:117] "RemoveContainer" containerID="013c6ee4d42e5d1788e6ddf9983cc46b59a6ae491ddb9af367a22a54fa4abe26" Jan 05 14:02:49 crc kubenswrapper[4740]: E0105 14:02:49.854103 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"013c6ee4d42e5d1788e6ddf9983cc46b59a6ae491ddb9af367a22a54fa4abe26\": container with ID starting with 013c6ee4d42e5d1788e6ddf9983cc46b59a6ae491ddb9af367a22a54fa4abe26 not found: ID does not exist" containerID="013c6ee4d42e5d1788e6ddf9983cc46b59a6ae491ddb9af367a22a54fa4abe26" Jan 05 14:02:49 crc kubenswrapper[4740]: I0105 14:02:49.854133 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013c6ee4d42e5d1788e6ddf9983cc46b59a6ae491ddb9af367a22a54fa4abe26"} err="failed to get container status \"013c6ee4d42e5d1788e6ddf9983cc46b59a6ae491ddb9af367a22a54fa4abe26\": rpc error: code = NotFound desc = could not find container \"013c6ee4d42e5d1788e6ddf9983cc46b59a6ae491ddb9af367a22a54fa4abe26\": container with ID starting with 013c6ee4d42e5d1788e6ddf9983cc46b59a6ae491ddb9af367a22a54fa4abe26 not found: ID does not exist" Jan 05 14:02:50 crc kubenswrapper[4740]: I0105 14:02:50.983507 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66" path="/var/lib/kubelet/pods/fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66/volumes" Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.400696 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84"] Jan 05 14:02:51 crc kubenswrapper[4740]: E0105 14:02:51.401020 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66" containerName="registry-server" Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.401039 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66" containerName="registry-server" Jan 05 14:02:51 crc kubenswrapper[4740]: E0105 14:02:51.401093 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66" containerName="extract-content" Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.401102 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66" containerName="extract-content" Jan 05 14:02:51 crc kubenswrapper[4740]: E0105 14:02:51.401119 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66" 
containerName="extract-utilities" Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.401128 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66" containerName="extract-utilities" Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.401301 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbf0a63f-4b55-4f96-8a6e-86fa09f8ca66" containerName="registry-server" Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.402564 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.407082 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.435355 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84"] Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.555907 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c471fc2a-21ef-4027-b9ce-5d335f5735f2-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84\" (UID: \"c471fc2a-21ef-4027-b9ce-5d335f5735f2\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.556158 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lm92\" (UniqueName: \"kubernetes.io/projected/c471fc2a-21ef-4027-b9ce-5d335f5735f2-kube-api-access-2lm92\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84\" (UID: \"c471fc2a-21ef-4027-b9ce-5d335f5735f2\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.556354 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c471fc2a-21ef-4027-b9ce-5d335f5735f2-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84\" (UID: \"c471fc2a-21ef-4027-b9ce-5d335f5735f2\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.658411 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c471fc2a-21ef-4027-b9ce-5d335f5735f2-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84\" (UID: \"c471fc2a-21ef-4027-b9ce-5d335f5735f2\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.658500 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c471fc2a-21ef-4027-b9ce-5d335f5735f2-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84\" (UID: \"c471fc2a-21ef-4027-b9ce-5d335f5735f2\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.658541 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2lm92\" (UniqueName: \"kubernetes.io/projected/c471fc2a-21ef-4027-b9ce-5d335f5735f2-kube-api-access-2lm92\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84\" (UID: \"c471fc2a-21ef-4027-b9ce-5d335f5735f2\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.658964 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c471fc2a-21ef-4027-b9ce-5d335f5735f2-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84\" (UID: \"c471fc2a-21ef-4027-b9ce-5d335f5735f2\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.659149 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c471fc2a-21ef-4027-b9ce-5d335f5735f2-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84\" (UID: \"c471fc2a-21ef-4027-b9ce-5d335f5735f2\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.697676 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lm92\" (UniqueName: \"kubernetes.io/projected/c471fc2a-21ef-4027-b9ce-5d335f5735f2-kube-api-access-2lm92\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84\" (UID: \"c471fc2a-21ef-4027-b9ce-5d335f5735f2\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" Jan 05 14:02:51 crc kubenswrapper[4740]: I0105 14:02:51.719093 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" Jan 05 14:02:52 crc kubenswrapper[4740]: I0105 14:02:52.137359 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84"] Jan 05 14:02:52 crc kubenswrapper[4740]: W0105 14:02:52.150118 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc471fc2a_21ef_4027_b9ce_5d335f5735f2.slice/crio-7fb4b48c03588bf656bc512e1dc26fb3f9fed980bcb3e34410c2600455f1147f WatchSource:0}: Error finding container 7fb4b48c03588bf656bc512e1dc26fb3f9fed980bcb3e34410c2600455f1147f: Status 404 returned error can't find the container with id 7fb4b48c03588bf656bc512e1dc26fb3f9fed980bcb3e34410c2600455f1147f Jan 05 14:02:52 crc kubenswrapper[4740]: I0105 14:02:52.827618 4740 generic.go:334] "Generic (PLEG): container finished" podID="c471fc2a-21ef-4027-b9ce-5d335f5735f2" containerID="15526d0d33baeffcbc06928f7778432f5b26e3652b024d6ac7183dd844ab2e12" exitCode=0 Jan 05 14:02:52 crc kubenswrapper[4740]: I0105 14:02:52.827704 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" event={"ID":"c471fc2a-21ef-4027-b9ce-5d335f5735f2","Type":"ContainerDied","Data":"15526d0d33baeffcbc06928f7778432f5b26e3652b024d6ac7183dd844ab2e12"} Jan 05 14:02:52 crc kubenswrapper[4740]: I0105 14:02:52.828535 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" event={"ID":"c471fc2a-21ef-4027-b9ce-5d335f5735f2","Type":"ContainerStarted","Data":"7fb4b48c03588bf656bc512e1dc26fb3f9fed980bcb3e34410c2600455f1147f"} Jan 05 14:02:54 crc kubenswrapper[4740]: I0105 14:02:54.852843 4740 generic.go:334] "Generic (PLEG): container finished" podID="c471fc2a-21ef-4027-b9ce-5d335f5735f2" containerID="b42fa0107e4dabd71cd7762ffc6c870a475f57ffac6ef8107579da09c7eeeb7e" exitCode=0 Jan 05 14:02:54 crc kubenswrapper[4740]: I0105 14:02:54.852918 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" event={"ID":"c471fc2a-21ef-4027-b9ce-5d335f5735f2","Type":"ContainerDied","Data":"b42fa0107e4dabd71cd7762ffc6c870a475f57ffac6ef8107579da09c7eeeb7e"} Jan 05 14:02:54 crc kubenswrapper[4740]: I0105 14:02:54.889791 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5qd7x" Jan 05 14:02:54 crc kubenswrapper[4740]: I0105 14:02:54.889845 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5qd7x" Jan 05 14:02:54 crc kubenswrapper[4740]: I0105 14:02:54.932741 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5qd7x" Jan 05 14:02:55 crc kubenswrapper[4740]: I0105 14:02:55.863943 4740 generic.go:334] "Generic (PLEG): container finished" podID="c471fc2a-21ef-4027-b9ce-5d335f5735f2" containerID="4069160db518583ffdf087eee6dc5e898238ac7ee1aab88d3312d639511ad777" exitCode=0 Jan 05 14:02:55 crc kubenswrapper[4740]: I0105 14:02:55.864056 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" 
event={"ID":"c471fc2a-21ef-4027-b9ce-5d335f5735f2","Type":"ContainerDied","Data":"4069160db518583ffdf087eee6dc5e898238ac7ee1aab88d3312d639511ad777"} Jan 05 14:02:55 crc kubenswrapper[4740]: I0105 14:02:55.909681 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5qd7x" Jan 05 14:02:57 crc kubenswrapper[4740]: I0105 14:02:57.207983 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" Jan 05 14:02:57 crc kubenswrapper[4740]: I0105 14:02:57.361170 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c471fc2a-21ef-4027-b9ce-5d335f5735f2-util\") pod \"c471fc2a-21ef-4027-b9ce-5d335f5735f2\" (UID: \"c471fc2a-21ef-4027-b9ce-5d335f5735f2\") " Jan 05 14:02:57 crc kubenswrapper[4740]: I0105 14:02:57.361530 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c471fc2a-21ef-4027-b9ce-5d335f5735f2-bundle\") pod \"c471fc2a-21ef-4027-b9ce-5d335f5735f2\" (UID: \"c471fc2a-21ef-4027-b9ce-5d335f5735f2\") " Jan 05 14:02:57 crc kubenswrapper[4740]: I0105 14:02:57.361617 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lm92\" (UniqueName: \"kubernetes.io/projected/c471fc2a-21ef-4027-b9ce-5d335f5735f2-kube-api-access-2lm92\") pod \"c471fc2a-21ef-4027-b9ce-5d335f5735f2\" (UID: \"c471fc2a-21ef-4027-b9ce-5d335f5735f2\") " Jan 05 14:02:57 crc kubenswrapper[4740]: I0105 14:02:57.364286 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c471fc2a-21ef-4027-b9ce-5d335f5735f2-bundle" (OuterVolumeSpecName: "bundle") pod "c471fc2a-21ef-4027-b9ce-5d335f5735f2" (UID: "c471fc2a-21ef-4027-b9ce-5d335f5735f2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:02:57 crc kubenswrapper[4740]: I0105 14:02:57.366396 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c471fc2a-21ef-4027-b9ce-5d335f5735f2-kube-api-access-2lm92" (OuterVolumeSpecName: "kube-api-access-2lm92") pod "c471fc2a-21ef-4027-b9ce-5d335f5735f2" (UID: "c471fc2a-21ef-4027-b9ce-5d335f5735f2"). InnerVolumeSpecName "kube-api-access-2lm92". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:02:57 crc kubenswrapper[4740]: I0105 14:02:57.391867 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c471fc2a-21ef-4027-b9ce-5d335f5735f2-util" (OuterVolumeSpecName: "util") pod "c471fc2a-21ef-4027-b9ce-5d335f5735f2" (UID: "c471fc2a-21ef-4027-b9ce-5d335f5735f2"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:02:57 crc kubenswrapper[4740]: I0105 14:02:57.463039 4740 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c471fc2a-21ef-4027-b9ce-5d335f5735f2-util\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:57 crc kubenswrapper[4740]: I0105 14:02:57.463313 4740 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c471fc2a-21ef-4027-b9ce-5d335f5735f2-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:57 crc kubenswrapper[4740]: I0105 14:02:57.463394 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lm92\" (UniqueName: \"kubernetes.io/projected/c471fc2a-21ef-4027-b9ce-5d335f5735f2-kube-api-access-2lm92\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:57 crc kubenswrapper[4740]: I0105 14:02:57.880737 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" event={"ID":"c471fc2a-21ef-4027-b9ce-5d335f5735f2","Type":"ContainerDied","Data":"7fb4b48c03588bf656bc512e1dc26fb3f9fed980bcb3e34410c2600455f1147f"} Jan 05 14:02:57 crc kubenswrapper[4740]: I0105 14:02:57.880793 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fb4b48c03588bf656bc512e1dc26fb3f9fed980bcb3e34410c2600455f1147f" Jan 05 14:02:57 crc kubenswrapper[4740]: I0105 14:02:57.880801 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84" Jan 05 14:02:58 crc kubenswrapper[4740]: I0105 14:02:58.539649 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5qd7x"] Jan 05 14:02:58 crc kubenswrapper[4740]: I0105 14:02:58.541391 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5qd7x" podUID="2a5c77d7-201f-4d04-9b26-b15c3e1905ab" containerName="registry-server" containerID="cri-o://9e8dd981717da002db76d5fe4924fcc8841ba9a293e108865d12f78e1ce1b8b9" gracePeriod=2 Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.552024 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5qd7x" Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.717046 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-872wj\" (UniqueName: \"kubernetes.io/projected/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-kube-api-access-872wj\") pod \"2a5c77d7-201f-4d04-9b26-b15c3e1905ab\" (UID: \"2a5c77d7-201f-4d04-9b26-b15c3e1905ab\") " Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.717162 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-catalog-content\") pod \"2a5c77d7-201f-4d04-9b26-b15c3e1905ab\" (UID: \"2a5c77d7-201f-4d04-9b26-b15c3e1905ab\") " Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.717309 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-utilities\") pod \"2a5c77d7-201f-4d04-9b26-b15c3e1905ab\" (UID: \"2a5c77d7-201f-4d04-9b26-b15c3e1905ab\") " Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.719294 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-utilities" (OuterVolumeSpecName: "utilities") pod "2a5c77d7-201f-4d04-9b26-b15c3e1905ab" (UID: "2a5c77d7-201f-4d04-9b26-b15c3e1905ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.725381 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-kube-api-access-872wj" (OuterVolumeSpecName: "kube-api-access-872wj") pod "2a5c77d7-201f-4d04-9b26-b15c3e1905ab" (UID: "2a5c77d7-201f-4d04-9b26-b15c3e1905ab"). InnerVolumeSpecName "kube-api-access-872wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.806297 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a5c77d7-201f-4d04-9b26-b15c3e1905ab" (UID: "2a5c77d7-201f-4d04-9b26-b15c3e1905ab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.820925 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.820964 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-872wj\" (UniqueName: \"kubernetes.io/projected/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-kube-api-access-872wj\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.820982 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5c77d7-201f-4d04-9b26-b15c3e1905ab-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.903114 4740 generic.go:334] "Generic (PLEG): container finished" podID="2a5c77d7-201f-4d04-9b26-b15c3e1905ab" containerID="9e8dd981717da002db76d5fe4924fcc8841ba9a293e108865d12f78e1ce1b8b9" exitCode=0 Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.903203 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qd7x" event={"ID":"2a5c77d7-201f-4d04-9b26-b15c3e1905ab","Type":"ContainerDied","Data":"9e8dd981717da002db76d5fe4924fcc8841ba9a293e108865d12f78e1ce1b8b9"} Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.903247 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5qd7x" event={"ID":"2a5c77d7-201f-4d04-9b26-b15c3e1905ab","Type":"ContainerDied","Data":"3737f25d095c2c295228484f69ae24136a8b3cfcc7f5073b5b4c3fd816061e07"} Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.903280 4740 scope.go:117] "RemoveContainer" containerID="9e8dd981717da002db76d5fe4924fcc8841ba9a293e108865d12f78e1ce1b8b9" Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.903356 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5qd7x" Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.945113 4740 scope.go:117] "RemoveContainer" containerID="7bee8f00a8334b9c297bea1cd38ad8a8d41fc957e91489e523019a420fc006eb" Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.960147 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5qd7x"] Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.974059 4740 scope.go:117] "RemoveContainer" containerID="6362d25d804a317d99a714d9781c5166d8cd8944d1899dae1bee89a13324ff77" Jan 05 14:02:59 crc kubenswrapper[4740]: I0105 14:02:59.974424 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5qd7x"] Jan 05 14:03:00 crc kubenswrapper[4740]: I0105 14:03:00.009370 4740 scope.go:117] "RemoveContainer" containerID="9e8dd981717da002db76d5fe4924fcc8841ba9a293e108865d12f78e1ce1b8b9" Jan 05 14:03:00 crc kubenswrapper[4740]: E0105 14:03:00.009821 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e8dd981717da002db76d5fe4924fcc8841ba9a293e108865d12f78e1ce1b8b9\": container with ID starting with 9e8dd981717da002db76d5fe4924fcc8841ba9a293e108865d12f78e1ce1b8b9 not found: ID does not exist" containerID="9e8dd981717da002db76d5fe4924fcc8841ba9a293e108865d12f78e1ce1b8b9" Jan 05 14:03:00 crc kubenswrapper[4740]: I0105 14:03:00.009872 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e8dd981717da002db76d5fe4924fcc8841ba9a293e108865d12f78e1ce1b8b9"} err="failed to get container status \"9e8dd981717da002db76d5fe4924fcc8841ba9a293e108865d12f78e1ce1b8b9\": rpc error: code = NotFound desc = could not find container \"9e8dd981717da002db76d5fe4924fcc8841ba9a293e108865d12f78e1ce1b8b9\": container with ID starting with 9e8dd981717da002db76d5fe4924fcc8841ba9a293e108865d12f78e1ce1b8b9 not found: ID does not exist" Jan 05 14:03:00 crc kubenswrapper[4740]: I0105 14:03:00.009906 4740 scope.go:117] "RemoveContainer" containerID="7bee8f00a8334b9c297bea1cd38ad8a8d41fc957e91489e523019a420fc006eb" Jan 05 14:03:00 crc kubenswrapper[4740]: E0105 14:03:00.011212 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bee8f00a8334b9c297bea1cd38ad8a8d41fc957e91489e523019a420fc006eb\": container with ID starting with 7bee8f00a8334b9c297bea1cd38ad8a8d41fc957e91489e523019a420fc006eb not found: ID does not exist" containerID="7bee8f00a8334b9c297bea1cd38ad8a8d41fc957e91489e523019a420fc006eb" Jan 05 14:03:00 crc kubenswrapper[4740]: I0105 14:03:00.011246 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bee8f00a8334b9c297bea1cd38ad8a8d41fc957e91489e523019a420fc006eb"} err="failed to get container status \"7bee8f00a8334b9c297bea1cd38ad8a8d41fc957e91489e523019a420fc006eb\": rpc error: code = NotFound desc = could not find container \"7bee8f00a8334b9c297bea1cd38ad8a8d41fc957e91489e523019a420fc006eb\": container with ID starting with 7bee8f00a8334b9c297bea1cd38ad8a8d41fc957e91489e523019a420fc006eb not found: ID does not exist" Jan 05 14:03:00 crc kubenswrapper[4740]: I0105 14:03:00.011272 4740 scope.go:117] "RemoveContainer" containerID="6362d25d804a317d99a714d9781c5166d8cd8944d1899dae1bee89a13324ff77" Jan 05 14:03:00 crc kubenswrapper[4740]: E0105 14:03:00.011767 4740 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6362d25d804a317d99a714d9781c5166d8cd8944d1899dae1bee89a13324ff77\": container with ID starting with 6362d25d804a317d99a714d9781c5166d8cd8944d1899dae1bee89a13324ff77 not found: ID does not exist" containerID="6362d25d804a317d99a714d9781c5166d8cd8944d1899dae1bee89a13324ff77" Jan 05 14:03:00 crc kubenswrapper[4740]: I0105 14:03:00.011785 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6362d25d804a317d99a714d9781c5166d8cd8944d1899dae1bee89a13324ff77"} err="failed to get container status \"6362d25d804a317d99a714d9781c5166d8cd8944d1899dae1bee89a13324ff77\": rpc error: code = NotFound desc = could not find container \"6362d25d804a317d99a714d9781c5166d8cd8944d1899dae1bee89a13324ff77\": container with ID starting with 6362d25d804a317d99a714d9781c5166d8cd8944d1899dae1bee89a13324ff77 not found: ID does not exist" Jan 05 14:03:00 crc kubenswrapper[4740]: I0105 14:03:00.984188 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a5c77d7-201f-4d04-9b26-b15c3e1905ab" path="/var/lib/kubelet/pods/2a5c77d7-201f-4d04-9b26-b15c3e1905ab/volumes" Jan 05 14:03:01 crc kubenswrapper[4740]: I0105 14:03:01.916157 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:03:01 crc kubenswrapper[4740]: I0105 14:03:01.916444 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:03:01 crc kubenswrapper[4740]: I0105 14:03:01.939916 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9rq82"] Jan 05 14:03:01 crc kubenswrapper[4740]: E0105 14:03:01.940256 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a5c77d7-201f-4d04-9b26-b15c3e1905ab" containerName="extract-utilities" Jan 05 14:03:01 crc kubenswrapper[4740]: I0105 14:03:01.940273 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a5c77d7-201f-4d04-9b26-b15c3e1905ab" containerName="extract-utilities" Jan 05 14:03:01 crc kubenswrapper[4740]: E0105 14:03:01.940331 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a5c77d7-201f-4d04-9b26-b15c3e1905ab" containerName="registry-server" Jan 05 14:03:01 crc kubenswrapper[4740]: I0105 14:03:01.940342 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a5c77d7-201f-4d04-9b26-b15c3e1905ab" containerName="registry-server" Jan 05 14:03:01 crc kubenswrapper[4740]: E0105 14:03:01.940364 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a5c77d7-201f-4d04-9b26-b15c3e1905ab" containerName="extract-content" Jan 05 14:03:01 crc kubenswrapper[4740]: I0105 14:03:01.940372 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a5c77d7-201f-4d04-9b26-b15c3e1905ab" containerName="extract-content" Jan 05 14:03:01 crc kubenswrapper[4740]: E0105 14:03:01.940381 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c471fc2a-21ef-4027-b9ce-5d335f5735f2" containerName="extract" Jan 05 14:03:01 crc kubenswrapper[4740]: I0105 
14:03:01.940387 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c471fc2a-21ef-4027-b9ce-5d335f5735f2" containerName="extract" Jan 05 14:03:01 crc kubenswrapper[4740]: E0105 14:03:01.940408 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c471fc2a-21ef-4027-b9ce-5d335f5735f2" containerName="pull" Jan 05 14:03:01 crc kubenswrapper[4740]: I0105 14:03:01.940415 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c471fc2a-21ef-4027-b9ce-5d335f5735f2" containerName="pull" Jan 05 14:03:01 crc kubenswrapper[4740]: E0105 14:03:01.940427 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c471fc2a-21ef-4027-b9ce-5d335f5735f2" containerName="util" Jan 05 14:03:01 crc kubenswrapper[4740]: I0105 14:03:01.940434 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c471fc2a-21ef-4027-b9ce-5d335f5735f2" containerName="util" Jan 05 14:03:01 crc kubenswrapper[4740]: I0105 14:03:01.940577 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a5c77d7-201f-4d04-9b26-b15c3e1905ab" containerName="registry-server" Jan 05 14:03:01 crc kubenswrapper[4740]: I0105 14:03:01.940587 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="c471fc2a-21ef-4027-b9ce-5d335f5735f2" containerName="extract" Jan 05 14:03:01 crc kubenswrapper[4740]: I0105 14:03:01.941657 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rq82" Jan 05 14:03:01 crc kubenswrapper[4740]: I0105 14:03:01.961890 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rq82"] Jan 05 14:03:02 crc kubenswrapper[4740]: I0105 14:03:02.060218 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07652c17-1df0-4cd0-ad86-9e1afcf38079-utilities\") pod \"redhat-marketplace-9rq82\" (UID: \"07652c17-1df0-4cd0-ad86-9e1afcf38079\") " pod="openshift-marketplace/redhat-marketplace-9rq82" Jan 05 14:03:02 crc kubenswrapper[4740]: I0105 14:03:02.060269 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thd6m\" (UniqueName: \"kubernetes.io/projected/07652c17-1df0-4cd0-ad86-9e1afcf38079-kube-api-access-thd6m\") pod \"redhat-marketplace-9rq82\" (UID: \"07652c17-1df0-4cd0-ad86-9e1afcf38079\") " pod="openshift-marketplace/redhat-marketplace-9rq82" Jan 05 14:03:02 crc kubenswrapper[4740]: I0105 14:03:02.060327 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07652c17-1df0-4cd0-ad86-9e1afcf38079-catalog-content\") pod \"redhat-marketplace-9rq82\" (UID: \"07652c17-1df0-4cd0-ad86-9e1afcf38079\") " pod="openshift-marketplace/redhat-marketplace-9rq82" Jan 05 14:03:02 crc kubenswrapper[4740]: I0105 14:03:02.162092 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thd6m\" (UniqueName: \"kubernetes.io/projected/07652c17-1df0-4cd0-ad86-9e1afcf38079-kube-api-access-thd6m\") pod \"redhat-marketplace-9rq82\" (UID: \"07652c17-1df0-4cd0-ad86-9e1afcf38079\") " pod="openshift-marketplace/redhat-marketplace-9rq82" Jan 05 14:03:02 crc kubenswrapper[4740]: I0105 14:03:02.162188 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/07652c17-1df0-4cd0-ad86-9e1afcf38079-catalog-content\") pod \"redhat-marketplace-9rq82\" (UID: \"07652c17-1df0-4cd0-ad86-9e1afcf38079\") " pod="openshift-marketplace/redhat-marketplace-9rq82" Jan 05 14:03:02 crc kubenswrapper[4740]: I0105 14:03:02.162366 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07652c17-1df0-4cd0-ad86-9e1afcf38079-utilities\") pod \"redhat-marketplace-9rq82\" (UID: \"07652c17-1df0-4cd0-ad86-9e1afcf38079\") " pod="openshift-marketplace/redhat-marketplace-9rq82" Jan 05 14:03:02 crc kubenswrapper[4740]: I0105 14:03:02.162844 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07652c17-1df0-4cd0-ad86-9e1afcf38079-utilities\") pod \"redhat-marketplace-9rq82\" (UID: \"07652c17-1df0-4cd0-ad86-9e1afcf38079\") " pod="openshift-marketplace/redhat-marketplace-9rq82" Jan 05 14:03:02 crc kubenswrapper[4740]: I0105 14:03:02.162949 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07652c17-1df0-4cd0-ad86-9e1afcf38079-catalog-content\") pod \"redhat-marketplace-9rq82\" (UID: \"07652c17-1df0-4cd0-ad86-9e1afcf38079\") " pod="openshift-marketplace/redhat-marketplace-9rq82" Jan 05 14:03:02 crc kubenswrapper[4740]: I0105 14:03:02.182262 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thd6m\" (UniqueName: \"kubernetes.io/projected/07652c17-1df0-4cd0-ad86-9e1afcf38079-kube-api-access-thd6m\") pod \"redhat-marketplace-9rq82\" (UID: \"07652c17-1df0-4cd0-ad86-9e1afcf38079\") " pod="openshift-marketplace/redhat-marketplace-9rq82" Jan 05 14:03:02 crc kubenswrapper[4740]: I0105 14:03:02.258440 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rq82" Jan 05 14:03:02 crc kubenswrapper[4740]: I0105 14:03:02.778354 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rq82"] Jan 05 14:03:02 crc kubenswrapper[4740]: I0105 14:03:02.932636 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rq82" event={"ID":"07652c17-1df0-4cd0-ad86-9e1afcf38079","Type":"ContainerStarted","Data":"dcd4f8dbad830e17ac8cb610d6dea08f66eea771201be1c6eacb882d3c27f70b"} Jan 05 14:03:03 crc kubenswrapper[4740]: I0105 14:03:03.941366 4740 generic.go:334] "Generic (PLEG): container finished" podID="07652c17-1df0-4cd0-ad86-9e1afcf38079" containerID="f61d366d0bad3c3cbdb9576707aa73f9c3cd20847594a86b688e11d3a326b15f" exitCode=0 Jan 05 14:03:03 crc kubenswrapper[4740]: I0105 14:03:03.941474 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rq82" event={"ID":"07652c17-1df0-4cd0-ad86-9e1afcf38079","Type":"ContainerDied","Data":"f61d366d0bad3c3cbdb9576707aa73f9c3cd20847594a86b688e11d3a326b15f"} Jan 05 14:03:05 crc kubenswrapper[4740]: I0105 14:03:05.957800 4740 generic.go:334] "Generic (PLEG): container finished" podID="07652c17-1df0-4cd0-ad86-9e1afcf38079" containerID="2c22ffb726c79591abc426770a2362dbcd5c1aa60ae803472c38dc0f60338693" exitCode=0 Jan 05 14:03:05 crc kubenswrapper[4740]: I0105 14:03:05.957905 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rq82" event={"ID":"07652c17-1df0-4cd0-ad86-9e1afcf38079","Type":"ContainerDied","Data":"2c22ffb726c79591abc426770a2362dbcd5c1aa60ae803472c38dc0f60338693"} Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.595211 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54"] Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.596931 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.598427 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.599588 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.599893 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.599978 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-wgpjn" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.600914 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.613100 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54"] Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.785911 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8de8b15c-984a-4c5f-a956-5f4244da97ef-apiservice-cert\") pod \"metallb-operator-controller-manager-65b6fb4bb9-pfr54\" (UID: \"8de8b15c-984a-4c5f-a956-5f4244da97ef\") " pod="metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.785966 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8de8b15c-984a-4c5f-a956-5f4244da97ef-webhook-cert\") pod \"metallb-operator-controller-manager-65b6fb4bb9-pfr54\" (UID: \"8de8b15c-984a-4c5f-a956-5f4244da97ef\") " pod="metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.786783 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gs7s\" (UniqueName: \"kubernetes.io/projected/8de8b15c-984a-4c5f-a956-5f4244da97ef-kube-api-access-5gs7s\") pod \"metallb-operator-controller-manager-65b6fb4bb9-pfr54\" (UID: \"8de8b15c-984a-4c5f-a956-5f4244da97ef\") " pod="metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.838456 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79"] Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.840268 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.846903 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-sjk2v" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.846923 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.847095 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.872276 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79"] Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.887975 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8de8b15c-984a-4c5f-a956-5f4244da97ef-apiservice-cert\") pod \"metallb-operator-controller-manager-65b6fb4bb9-pfr54\" (UID: \"8de8b15c-984a-4c5f-a956-5f4244da97ef\") " pod="metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.888280 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8de8b15c-984a-4c5f-a956-5f4244da97ef-webhook-cert\") pod \"metallb-operator-controller-manager-65b6fb4bb9-pfr54\" (UID: \"8de8b15c-984a-4c5f-a956-5f4244da97ef\") " pod="metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.888377 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gs7s\" (UniqueName: \"kubernetes.io/projected/8de8b15c-984a-4c5f-a956-5f4244da97ef-kube-api-access-5gs7s\") pod \"metallb-operator-controller-manager-65b6fb4bb9-pfr54\" (UID: \"8de8b15c-984a-4c5f-a956-5f4244da97ef\") " pod="metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.895654 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8de8b15c-984a-4c5f-a956-5f4244da97ef-webhook-cert\") pod \"metallb-operator-controller-manager-65b6fb4bb9-pfr54\" (UID: \"8de8b15c-984a-4c5f-a956-5f4244da97ef\") " pod="metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.908729 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8de8b15c-984a-4c5f-a956-5f4244da97ef-apiservice-cert\") pod \"metallb-operator-controller-manager-65b6fb4bb9-pfr54\" (UID: \"8de8b15c-984a-4c5f-a956-5f4244da97ef\") " pod="metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.937757 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gs7s\" (UniqueName: \"kubernetes.io/projected/8de8b15c-984a-4c5f-a956-5f4244da97ef-kube-api-access-5gs7s\") pod \"metallb-operator-controller-manager-65b6fb4bb9-pfr54\" (UID: \"8de8b15c-984a-4c5f-a956-5f4244da97ef\") " pod="metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.989929 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/939f9254-d7a5-48cb-8ab1-7ea2e4f68610-webhook-cert\") pod \"metallb-operator-webhook-server-d9475cc54-vwc79\" (UID: \"939f9254-d7a5-48cb-8ab1-7ea2e4f68610\") " pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.991477 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/939f9254-d7a5-48cb-8ab1-7ea2e4f68610-apiservice-cert\") pod \"metallb-operator-webhook-server-d9475cc54-vwc79\" (UID: \"939f9254-d7a5-48cb-8ab1-7ea2e4f68610\") " pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.991620 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9mbv\" (UniqueName: \"kubernetes.io/projected/939f9254-d7a5-48cb-8ab1-7ea2e4f68610-kube-api-access-c9mbv\") pod \"metallb-operator-webhook-server-d9475cc54-vwc79\" (UID: \"939f9254-d7a5-48cb-8ab1-7ea2e4f68610\") " pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" Jan 05 14:03:07 crc kubenswrapper[4740]: I0105 14:03:07.993202 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rq82" event={"ID":"07652c17-1df0-4cd0-ad86-9e1afcf38079","Type":"ContainerStarted","Data":"00b040511180961272dce0d4cedcd33400af1aa2efd5f87263cc0f48cf6d3b98"} Jan 05 14:03:08 crc kubenswrapper[4740]: I0105 14:03:08.093414 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/939f9254-d7a5-48cb-8ab1-7ea2e4f68610-webhook-cert\") pod \"metallb-operator-webhook-server-d9475cc54-vwc79\" (UID: \"939f9254-d7a5-48cb-8ab1-7ea2e4f68610\") " pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" Jan 05 14:03:08 crc kubenswrapper[4740]: I0105 14:03:08.093513 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/939f9254-d7a5-48cb-8ab1-7ea2e4f68610-apiservice-cert\") pod \"metallb-operator-webhook-server-d9475cc54-vwc79\" (UID: \"939f9254-d7a5-48cb-8ab1-7ea2e4f68610\") " pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" Jan 05 14:03:08 crc kubenswrapper[4740]: I0105 14:03:08.093583 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9mbv\" (UniqueName: \"kubernetes.io/projected/939f9254-d7a5-48cb-8ab1-7ea2e4f68610-kube-api-access-c9mbv\") pod \"metallb-operator-webhook-server-d9475cc54-vwc79\" (UID: \"939f9254-d7a5-48cb-8ab1-7ea2e4f68610\") " pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" Jan 05 14:03:08 crc kubenswrapper[4740]: I0105 14:03:08.099817 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/939f9254-d7a5-48cb-8ab1-7ea2e4f68610-apiservice-cert\") pod \"metallb-operator-webhook-server-d9475cc54-vwc79\" (UID: \"939f9254-d7a5-48cb-8ab1-7ea2e4f68610\") " pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" Jan 05 14:03:08 crc kubenswrapper[4740]: I0105 14:03:08.100207 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/939f9254-d7a5-48cb-8ab1-7ea2e4f68610-webhook-cert\") pod \"metallb-operator-webhook-server-d9475cc54-vwc79\" (UID: \"939f9254-d7a5-48cb-8ab1-7ea2e4f68610\") " pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" Jan 05 14:03:08 crc kubenswrapper[4740]: I0105 14:03:08.109303 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9mbv\" (UniqueName: \"kubernetes.io/projected/939f9254-d7a5-48cb-8ab1-7ea2e4f68610-kube-api-access-c9mbv\") pod \"metallb-operator-webhook-server-d9475cc54-vwc79\" (UID: \"939f9254-d7a5-48cb-8ab1-7ea2e4f68610\") " pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" Jan 05 14:03:08 crc kubenswrapper[4740]: I0105 14:03:08.163630 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" Jan 05 14:03:08 crc kubenswrapper[4740]: I0105 14:03:08.210233 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54" Jan 05 14:03:08 crc kubenswrapper[4740]: I0105 14:03:08.633036 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9rq82" podStartSLOduration=4.654065341 podStartE2EDuration="7.633013145s" podCreationTimestamp="2026-01-05 14:03:01 +0000 UTC" firstStartedPulling="2026-01-05 14:03:03.943444502 +0000 UTC m=+833.250353081" lastFinishedPulling="2026-01-05 14:03:06.922392306 +0000 UTC m=+836.229300885" observedRunningTime="2026-01-05 14:03:08.025348869 +0000 UTC m=+837.332257448" watchObservedRunningTime="2026-01-05 14:03:08.633013145 +0000 UTC m=+837.939921734" Jan 05 14:03:08 crc kubenswrapper[4740]: I0105 14:03:08.637860 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79"] Jan 05 14:03:08 crc kubenswrapper[4740]: W0105 14:03:08.641637 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod939f9254_d7a5_48cb_8ab1_7ea2e4f68610.slice/crio-36f6b19216b0c638333dcc66ea59bb44c74f3155159a88654fd7236abb6824c3 WatchSource:0}: Error finding container 36f6b19216b0c638333dcc66ea59bb44c74f3155159a88654fd7236abb6824c3: Status 404 returned error can't find the container with id 36f6b19216b0c638333dcc66ea59bb44c74f3155159a88654fd7236abb6824c3 Jan 05 14:03:08 crc kubenswrapper[4740]: I0105 14:03:08.781942 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54"] Jan 05 14:03:09 crc kubenswrapper[4740]: I0105 14:03:09.000490 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" event={"ID":"939f9254-d7a5-48cb-8ab1-7ea2e4f68610","Type":"ContainerStarted","Data":"36f6b19216b0c638333dcc66ea59bb44c74f3155159a88654fd7236abb6824c3"} Jan 05 14:03:09 crc kubenswrapper[4740]: I0105 14:03:09.001591 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54" event={"ID":"8de8b15c-984a-4c5f-a956-5f4244da97ef","Type":"ContainerStarted","Data":"47d2d5b530c197336479304589e4116eccd40cbc74fd861c9b14efaffb0f71bb"} Jan 05 14:03:12 crc kubenswrapper[4740]: I0105 14:03:12.259557 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9rq82" Jan 05 14:03:12 crc 
kubenswrapper[4740]: I0105 14:03:12.260744 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9rq82" Jan 05 14:03:12 crc kubenswrapper[4740]: I0105 14:03:12.302777 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9rq82" Jan 05 14:03:13 crc kubenswrapper[4740]: I0105 14:03:13.088802 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9rq82" Jan 05 14:03:14 crc kubenswrapper[4740]: I0105 14:03:14.529729 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rq82"] Jan 05 14:03:15 crc kubenswrapper[4740]: I0105 14:03:15.070411 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54" event={"ID":"8de8b15c-984a-4c5f-a956-5f4244da97ef","Type":"ContainerStarted","Data":"4e010e6d4c05e7fecbae5de8840ebfc880df980fd67f13958f3d08bdc3ae1f30"} Jan 05 14:03:15 crc kubenswrapper[4740]: I0105 14:03:15.070573 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54" Jan 05 14:03:15 crc kubenswrapper[4740]: I0105 14:03:15.072018 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" event={"ID":"939f9254-d7a5-48cb-8ab1-7ea2e4f68610","Type":"ContainerStarted","Data":"02dba798ce836469378a30015bc83abc00249bee63a8a9c082f291b3dda415a1"} Jan 05 14:03:15 crc kubenswrapper[4740]: I0105 14:03:15.096045 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54" podStartSLOduration=2.772334771 podStartE2EDuration="8.096028032s" podCreationTimestamp="2026-01-05 14:03:07 +0000 UTC" firstStartedPulling="2026-01-05 14:03:08.788569777 +0000 UTC m=+838.095478356" lastFinishedPulling="2026-01-05 14:03:14.112263028 +0000 UTC m=+843.419171617" observedRunningTime="2026-01-05 14:03:15.091314217 +0000 UTC m=+844.398222806" watchObservedRunningTime="2026-01-05 14:03:15.096028032 +0000 UTC m=+844.402936621" Jan 05 14:03:15 crc kubenswrapper[4740]: I0105 14:03:15.127246 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" podStartSLOduration=2.658368882 podStartE2EDuration="8.127223559s" podCreationTimestamp="2026-01-05 14:03:07 +0000 UTC" firstStartedPulling="2026-01-05 14:03:08.644831878 +0000 UTC m=+837.951740457" lastFinishedPulling="2026-01-05 14:03:14.113686555 +0000 UTC m=+843.420595134" observedRunningTime="2026-01-05 14:03:15.123657894 +0000 UTC m=+844.430566483" watchObservedRunningTime="2026-01-05 14:03:15.127223559 +0000 UTC m=+844.434132138" Jan 05 14:03:16 crc kubenswrapper[4740]: I0105 14:03:16.083645 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9rq82" podUID="07652c17-1df0-4cd0-ad86-9e1afcf38079" containerName="registry-server" containerID="cri-o://00b040511180961272dce0d4cedcd33400af1aa2efd5f87263cc0f48cf6d3b98" gracePeriod=2 Jan 05 14:03:16 crc kubenswrapper[4740]: I0105 14:03:16.084208 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.088033 4740 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rq82" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.093905 4740 generic.go:334] "Generic (PLEG): container finished" podID="07652c17-1df0-4cd0-ad86-9e1afcf38079" containerID="00b040511180961272dce0d4cedcd33400af1aa2efd5f87263cc0f48cf6d3b98" exitCode=0 Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.094805 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rq82" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.094986 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rq82" event={"ID":"07652c17-1df0-4cd0-ad86-9e1afcf38079","Type":"ContainerDied","Data":"00b040511180961272dce0d4cedcd33400af1aa2efd5f87263cc0f48cf6d3b98"} Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.095082 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rq82" event={"ID":"07652c17-1df0-4cd0-ad86-9e1afcf38079","Type":"ContainerDied","Data":"dcd4f8dbad830e17ac8cb610d6dea08f66eea771201be1c6eacb882d3c27f70b"} Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.095106 4740 scope.go:117] "RemoveContainer" containerID="00b040511180961272dce0d4cedcd33400af1aa2efd5f87263cc0f48cf6d3b98" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.122120 4740 scope.go:117] "RemoveContainer" containerID="2c22ffb726c79591abc426770a2362dbcd5c1aa60ae803472c38dc0f60338693" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.153236 4740 scope.go:117] "RemoveContainer" containerID="f61d366d0bad3c3cbdb9576707aa73f9c3cd20847594a86b688e11d3a326b15f" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.244348 4740 scope.go:117] "RemoveContainer" containerID="00b040511180961272dce0d4cedcd33400af1aa2efd5f87263cc0f48cf6d3b98" Jan 05 14:03:17 crc kubenswrapper[4740]: E0105 14:03:17.245560 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00b040511180961272dce0d4cedcd33400af1aa2efd5f87263cc0f48cf6d3b98\": container with ID starting with 00b040511180961272dce0d4cedcd33400af1aa2efd5f87263cc0f48cf6d3b98 not found: ID does not exist" containerID="00b040511180961272dce0d4cedcd33400af1aa2efd5f87263cc0f48cf6d3b98" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.245604 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00b040511180961272dce0d4cedcd33400af1aa2efd5f87263cc0f48cf6d3b98"} err="failed to get container status \"00b040511180961272dce0d4cedcd33400af1aa2efd5f87263cc0f48cf6d3b98\": rpc error: code = NotFound desc = could not find container \"00b040511180961272dce0d4cedcd33400af1aa2efd5f87263cc0f48cf6d3b98\": container with ID starting with 00b040511180961272dce0d4cedcd33400af1aa2efd5f87263cc0f48cf6d3b98 not found: ID does not exist" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.245626 4740 scope.go:117] "RemoveContainer" containerID="2c22ffb726c79591abc426770a2362dbcd5c1aa60ae803472c38dc0f60338693" Jan 05 14:03:17 crc kubenswrapper[4740]: E0105 14:03:17.245948 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c22ffb726c79591abc426770a2362dbcd5c1aa60ae803472c38dc0f60338693\": container with ID starting with 2c22ffb726c79591abc426770a2362dbcd5c1aa60ae803472c38dc0f60338693 not found: ID does 
not exist" containerID="2c22ffb726c79591abc426770a2362dbcd5c1aa60ae803472c38dc0f60338693" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.245968 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c22ffb726c79591abc426770a2362dbcd5c1aa60ae803472c38dc0f60338693"} err="failed to get container status \"2c22ffb726c79591abc426770a2362dbcd5c1aa60ae803472c38dc0f60338693\": rpc error: code = NotFound desc = could not find container \"2c22ffb726c79591abc426770a2362dbcd5c1aa60ae803472c38dc0f60338693\": container with ID starting with 2c22ffb726c79591abc426770a2362dbcd5c1aa60ae803472c38dc0f60338693 not found: ID does not exist" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.246001 4740 scope.go:117] "RemoveContainer" containerID="f61d366d0bad3c3cbdb9576707aa73f9c3cd20847594a86b688e11d3a326b15f" Jan 05 14:03:17 crc kubenswrapper[4740]: E0105 14:03:17.254531 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f61d366d0bad3c3cbdb9576707aa73f9c3cd20847594a86b688e11d3a326b15f\": container with ID starting with f61d366d0bad3c3cbdb9576707aa73f9c3cd20847594a86b688e11d3a326b15f not found: ID does not exist" containerID="f61d366d0bad3c3cbdb9576707aa73f9c3cd20847594a86b688e11d3a326b15f" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.254573 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f61d366d0bad3c3cbdb9576707aa73f9c3cd20847594a86b688e11d3a326b15f"} err="failed to get container status \"f61d366d0bad3c3cbdb9576707aa73f9c3cd20847594a86b688e11d3a326b15f\": rpc error: code = NotFound desc = could not find container \"f61d366d0bad3c3cbdb9576707aa73f9c3cd20847594a86b688e11d3a326b15f\": container with ID starting with f61d366d0bad3c3cbdb9576707aa73f9c3cd20847594a86b688e11d3a326b15f not found: ID does not exist" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.254930 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07652c17-1df0-4cd0-ad86-9e1afcf38079-utilities\") pod \"07652c17-1df0-4cd0-ad86-9e1afcf38079\" (UID: \"07652c17-1df0-4cd0-ad86-9e1afcf38079\") " Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.255833 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07652c17-1df0-4cd0-ad86-9e1afcf38079-utilities" (OuterVolumeSpecName: "utilities") pod "07652c17-1df0-4cd0-ad86-9e1afcf38079" (UID: "07652c17-1df0-4cd0-ad86-9e1afcf38079"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.255960 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07652c17-1df0-4cd0-ad86-9e1afcf38079-catalog-content\") pod \"07652c17-1df0-4cd0-ad86-9e1afcf38079\" (UID: \"07652c17-1df0-4cd0-ad86-9e1afcf38079\") " Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.264755 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thd6m\" (UniqueName: \"kubernetes.io/projected/07652c17-1df0-4cd0-ad86-9e1afcf38079-kube-api-access-thd6m\") pod \"07652c17-1df0-4cd0-ad86-9e1afcf38079\" (UID: \"07652c17-1df0-4cd0-ad86-9e1afcf38079\") " Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.265158 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07652c17-1df0-4cd0-ad86-9e1afcf38079-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.274249 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07652c17-1df0-4cd0-ad86-9e1afcf38079-kube-api-access-thd6m" (OuterVolumeSpecName: "kube-api-access-thd6m") pod "07652c17-1df0-4cd0-ad86-9e1afcf38079" (UID: "07652c17-1df0-4cd0-ad86-9e1afcf38079"). InnerVolumeSpecName "kube-api-access-thd6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.277518 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07652c17-1df0-4cd0-ad86-9e1afcf38079-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07652c17-1df0-4cd0-ad86-9e1afcf38079" (UID: "07652c17-1df0-4cd0-ad86-9e1afcf38079"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.366227 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thd6m\" (UniqueName: \"kubernetes.io/projected/07652c17-1df0-4cd0-ad86-9e1afcf38079-kube-api-access-thd6m\") on node \"crc\" DevicePath \"\"" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.366425 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07652c17-1df0-4cd0-ad86-9e1afcf38079-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.422446 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rq82"] Jan 05 14:03:17 crc kubenswrapper[4740]: I0105 14:03:17.428084 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rq82"] Jan 05 14:03:18 crc kubenswrapper[4740]: I0105 14:03:18.983060 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07652c17-1df0-4cd0-ad86-9e1afcf38079" path="/var/lib/kubelet/pods/07652c17-1df0-4cd0-ad86-9e1afcf38079/volumes" Jan 05 14:03:28 crc kubenswrapper[4740]: I0105 14:03:28.171816 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" Jan 05 14:03:31 crc kubenswrapper[4740]: I0105 14:03:31.916629 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:03:31 crc kubenswrapper[4740]: I0105 14:03:31.916975 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:03:31 crc kubenswrapper[4740]: I0105 14:03:31.917029 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 14:03:31 crc kubenswrapper[4740]: I0105 14:03:31.917847 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"615f01064ee45ac723f59788df185ab69f9b600b12c150fcc649dcf97daf611a"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 14:03:31 crc kubenswrapper[4740]: I0105 14:03:31.917906 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://615f01064ee45ac723f59788df185ab69f9b600b12c150fcc649dcf97daf611a" gracePeriod=600 Jan 05 14:03:32 crc kubenswrapper[4740]: E0105 14:03:32.177332 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7737db78_0989_433f_968a_7e5b441b7537.slice/crio-d1fef0aff31cd613b2d207d472e346207cc1735aecec60c1ba22825fb1d47f9d.scope\": 
RecentStats: unable to find data in memory cache]" Jan 05 14:03:32 crc kubenswrapper[4740]: I0105 14:03:32.216057 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="615f01064ee45ac723f59788df185ab69f9b600b12c150fcc649dcf97daf611a" exitCode=0 Jan 05 14:03:32 crc kubenswrapper[4740]: I0105 14:03:32.216127 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"615f01064ee45ac723f59788df185ab69f9b600b12c150fcc649dcf97daf611a"} Jan 05 14:03:32 crc kubenswrapper[4740]: I0105 14:03:32.216159 4740 scope.go:117] "RemoveContainer" containerID="c94fbb7c1e4a27a12915fd96b93743fa30aac1c7bed9369659cc71247bcbb496" Jan 05 14:03:33 crc kubenswrapper[4740]: I0105 14:03:33.225735 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"d1fef0aff31cd613b2d207d472e346207cc1735aecec60c1ba22825fb1d47f9d"} Jan 05 14:03:48 crc kubenswrapper[4740]: I0105 14:03:48.213048 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.047740 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7"] Jan 05 14:03:49 crc kubenswrapper[4740]: E0105 14:03:49.048402 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07652c17-1df0-4cd0-ad86-9e1afcf38079" containerName="extract-content" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.048426 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="07652c17-1df0-4cd0-ad86-9e1afcf38079" containerName="extract-content" Jan 05 14:03:49 crc kubenswrapper[4740]: E0105 14:03:49.048443 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07652c17-1df0-4cd0-ad86-9e1afcf38079" containerName="extract-utilities" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.048451 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="07652c17-1df0-4cd0-ad86-9e1afcf38079" containerName="extract-utilities" Jan 05 14:03:49 crc kubenswrapper[4740]: E0105 14:03:49.048467 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07652c17-1df0-4cd0-ad86-9e1afcf38079" containerName="registry-server" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.048474 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="07652c17-1df0-4cd0-ad86-9e1afcf38079" containerName="registry-server" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.048654 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="07652c17-1df0-4cd0-ad86-9e1afcf38079" containerName="registry-server" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.049325 4740 util.go:30] "No sandbox for pod can be found. 
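The machine-config-daemon restart above is the standard HTTP liveness-probe path: the kubelet issues a GET against the configured endpoint (here http://127.0.0.1:8798/health), the connection is refused, the container is reported as having failed its liveness probe, and it is killed with its termination grace period (600s here) so it can be restarted, which the following ContainerDied/ContainerStarted events confirm. A rough sketch of such an HTTP check, under the usual assumption that any 2xx/3xx status counts as success — this is not the kubelet's prober code:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeHTTP mimics the shape of an HTTP liveness check: GET the URL, treat
// transport errors (e.g. "connection refused", as in the log) and
// non-2xx/3xx statuses as failures.
func probeHTTP(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return fmt.Errorf("probe failed: %w", err) // connection refused lands here
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("probe failed: unexpected status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	// Same endpoint the kubelet was probing; on a machine where nothing
	// listens on 8798 this prints a "connection refused" failure.
	if err := probeHTTP("http://127.0.0.1:8798/health", 1*time.Second); err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println("healthy")
}
```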
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.051349 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-56d2m" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.052157 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.055462 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-ffh7k"] Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.060434 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.061870 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7"] Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.062751 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.063430 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.132703 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3c26b9da-cc4e-44dc-92ba-92e42b962010-frr-startup\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.132780 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3c26b9da-cc4e-44dc-92ba-92e42b962010-frr-sockets\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.132801 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3c26b9da-cc4e-44dc-92ba-92e42b962010-reloader\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.132912 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3c26b9da-cc4e-44dc-92ba-92e42b962010-frr-conf\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.132955 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afd48a06-614a-4fed-8629-a1a2eb83ab80-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-qhkx7\" (UID: \"afd48a06-614a-4fed-8629-a1a2eb83ab80\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.133057 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdkzk\" (UniqueName: \"kubernetes.io/projected/3c26b9da-cc4e-44dc-92ba-92e42b962010-kube-api-access-sdkzk\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " 
pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.133181 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3c26b9da-cc4e-44dc-92ba-92e42b962010-metrics\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.133330 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c26b9da-cc4e-44dc-92ba-92e42b962010-metrics-certs\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.133361 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8qhg\" (UniqueName: \"kubernetes.io/projected/afd48a06-614a-4fed-8629-a1a2eb83ab80-kube-api-access-w8qhg\") pod \"frr-k8s-webhook-server-7784b6fcf-qhkx7\" (UID: \"afd48a06-614a-4fed-8629-a1a2eb83ab80\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.181022 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-vp8lm"] Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.182211 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-vp8lm" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.184064 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.184294 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.184393 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.197264 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5bddd4b946-pgtqj"] Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.199095 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bddd4b946-pgtqj" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.199425 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-tz74p" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.200581 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.212587 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-pgtqj"] Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.234999 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdkzk\" (UniqueName: \"kubernetes.io/projected/3c26b9da-cc4e-44dc-92ba-92e42b962010-kube-api-access-sdkzk\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.235076 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3c26b9da-cc4e-44dc-92ba-92e42b962010-metrics\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.235109 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ba42b606-5ea8-4d51-a695-dc563937f304-metallb-excludel2\") pod \"speaker-vp8lm\" (UID: \"ba42b606-5ea8-4d51-a695-dc563937f304\") " pod="metallb-system/speaker-vp8lm" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.235132 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba42b606-5ea8-4d51-a695-dc563937f304-metrics-certs\") pod \"speaker-vp8lm\" (UID: \"ba42b606-5ea8-4d51-a695-dc563937f304\") " pod="metallb-system/speaker-vp8lm" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.235163 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c26b9da-cc4e-44dc-92ba-92e42b962010-metrics-certs\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.235184 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8qhg\" (UniqueName: \"kubernetes.io/projected/afd48a06-614a-4fed-8629-a1a2eb83ab80-kube-api-access-w8qhg\") pod \"frr-k8s-webhook-server-7784b6fcf-qhkx7\" (UID: \"afd48a06-614a-4fed-8629-a1a2eb83ab80\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.235200 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3c26b9da-cc4e-44dc-92ba-92e42b962010-frr-startup\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.235232 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3c26b9da-cc4e-44dc-92ba-92e42b962010-frr-sockets\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " 
pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.235249 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3c26b9da-cc4e-44dc-92ba-92e42b962010-reloader\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.235266 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp45k\" (UniqueName: \"kubernetes.io/projected/ba42b606-5ea8-4d51-a695-dc563937f304-kube-api-access-hp45k\") pod \"speaker-vp8lm\" (UID: \"ba42b606-5ea8-4d51-a695-dc563937f304\") " pod="metallb-system/speaker-vp8lm" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.235293 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3c26b9da-cc4e-44dc-92ba-92e42b962010-frr-conf\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.235309 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afd48a06-614a-4fed-8629-a1a2eb83ab80-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-qhkx7\" (UID: \"afd48a06-614a-4fed-8629-a1a2eb83ab80\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.235327 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ba42b606-5ea8-4d51-a695-dc563937f304-memberlist\") pod \"speaker-vp8lm\" (UID: \"ba42b606-5ea8-4d51-a695-dc563937f304\") " pod="metallb-system/speaker-vp8lm" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.236125 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3c26b9da-cc4e-44dc-92ba-92e42b962010-frr-sockets\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.236314 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3c26b9da-cc4e-44dc-92ba-92e42b962010-reloader\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.236429 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3c26b9da-cc4e-44dc-92ba-92e42b962010-frr-startup\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.236489 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3c26b9da-cc4e-44dc-92ba-92e42b962010-frr-conf\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.236658 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3c26b9da-cc4e-44dc-92ba-92e42b962010-metrics\") pod \"frr-k8s-ffh7k\" (UID: 
\"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.244251 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afd48a06-614a-4fed-8629-a1a2eb83ab80-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-qhkx7\" (UID: \"afd48a06-614a-4fed-8629-a1a2eb83ab80\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.254685 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdkzk\" (UniqueName: \"kubernetes.io/projected/3c26b9da-cc4e-44dc-92ba-92e42b962010-kube-api-access-sdkzk\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.257583 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c26b9da-cc4e-44dc-92ba-92e42b962010-metrics-certs\") pod \"frr-k8s-ffh7k\" (UID: \"3c26b9da-cc4e-44dc-92ba-92e42b962010\") " pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.260676 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8qhg\" (UniqueName: \"kubernetes.io/projected/afd48a06-614a-4fed-8629-a1a2eb83ab80-kube-api-access-w8qhg\") pod \"frr-k8s-webhook-server-7784b6fcf-qhkx7\" (UID: \"afd48a06-614a-4fed-8629-a1a2eb83ab80\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.337148 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ba42b606-5ea8-4d51-a695-dc563937f304-metallb-excludel2\") pod \"speaker-vp8lm\" (UID: \"ba42b606-5ea8-4d51-a695-dc563937f304\") " pod="metallb-system/speaker-vp8lm" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.337197 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba42b606-5ea8-4d51-a695-dc563937f304-metrics-certs\") pod \"speaker-vp8lm\" (UID: \"ba42b606-5ea8-4d51-a695-dc563937f304\") " pod="metallb-system/speaker-vp8lm" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.337247 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22a24e17-6432-4f77-a553-47f9de4d68e4-cert\") pod \"controller-5bddd4b946-pgtqj\" (UID: \"22a24e17-6432-4f77-a553-47f9de4d68e4\") " pod="metallb-system/controller-5bddd4b946-pgtqj" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.337265 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22a24e17-6432-4f77-a553-47f9de4d68e4-metrics-certs\") pod \"controller-5bddd4b946-pgtqj\" (UID: \"22a24e17-6432-4f77-a553-47f9de4d68e4\") " pod="metallb-system/controller-5bddd4b946-pgtqj" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.337305 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp45k\" (UniqueName: \"kubernetes.io/projected/ba42b606-5ea8-4d51-a695-dc563937f304-kube-api-access-hp45k\") pod \"speaker-vp8lm\" (UID: \"ba42b606-5ea8-4d51-a695-dc563937f304\") " pod="metallb-system/speaker-vp8lm" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 
14:03:49.337334 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ba42b606-5ea8-4d51-a695-dc563937f304-memberlist\") pod \"speaker-vp8lm\" (UID: \"ba42b606-5ea8-4d51-a695-dc563937f304\") " pod="metallb-system/speaker-vp8lm" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.337366 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mv88\" (UniqueName: \"kubernetes.io/projected/22a24e17-6432-4f77-a553-47f9de4d68e4-kube-api-access-8mv88\") pod \"controller-5bddd4b946-pgtqj\" (UID: \"22a24e17-6432-4f77-a553-47f9de4d68e4\") " pod="metallb-system/controller-5bddd4b946-pgtqj" Jan 05 14:03:49 crc kubenswrapper[4740]: E0105 14:03:49.337529 4740 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Jan 05 14:03:49 crc kubenswrapper[4740]: E0105 14:03:49.337611 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba42b606-5ea8-4d51-a695-dc563937f304-metrics-certs podName:ba42b606-5ea8-4d51-a695-dc563937f304 nodeName:}" failed. No retries permitted until 2026-01-05 14:03:49.837589286 +0000 UTC m=+879.144497865 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba42b606-5ea8-4d51-a695-dc563937f304-metrics-certs") pod "speaker-vp8lm" (UID: "ba42b606-5ea8-4d51-a695-dc563937f304") : secret "speaker-certs-secret" not found Jan 05 14:03:49 crc kubenswrapper[4740]: E0105 14:03:49.337758 4740 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 05 14:03:49 crc kubenswrapper[4740]: E0105 14:03:49.337810 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba42b606-5ea8-4d51-a695-dc563937f304-memberlist podName:ba42b606-5ea8-4d51-a695-dc563937f304 nodeName:}" failed. No retries permitted until 2026-01-05 14:03:49.837793902 +0000 UTC m=+879.144702471 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ba42b606-5ea8-4d51-a695-dc563937f304-memberlist") pod "speaker-vp8lm" (UID: "ba42b606-5ea8-4d51-a695-dc563937f304") : secret "metallb-memberlist" not found Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.337986 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ba42b606-5ea8-4d51-a695-dc563937f304-metallb-excludel2\") pod \"speaker-vp8lm\" (UID: \"ba42b606-5ea8-4d51-a695-dc563937f304\") " pod="metallb-system/speaker-vp8lm" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.366241 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp45k\" (UniqueName: \"kubernetes.io/projected/ba42b606-5ea8-4d51-a695-dc563937f304-kube-api-access-hp45k\") pod \"speaker-vp8lm\" (UID: \"ba42b606-5ea8-4d51-a695-dc563937f304\") " pod="metallb-system/speaker-vp8lm" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.370196 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.377721 4740 util.go:30] "No sandbox for pod can be found. 
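The MountVolume.SetUp failures above are not fatal: the referenced Secrets (speaker-certs-secret, metallb-memberlist) simply do not exist yet, so the kubelet schedules a retry, and the visible delay grows from 500ms here to 1s on the next attempt before the mounts finally succeed further down once the Secrets appear. A small sketch of that kind of doubling, capped retry loop — illustrative only, not the kubelet's nestedpendingoperations code; fetchSecret is a stand-in and the 2-minute cap is an arbitrary choice for the sketch:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

var errNotFound = errors.New(`secret "metallb-memberlist" not found`)

// fetchSecret stands in for the API lookup; it starts succeeding on the
// third attempt, roughly how the memberlist secret eventually appeared.
func fetchSecret(attempt int) error {
	if attempt < 3 {
		return errNotFound
	}
	return nil
}

func main() {
	delay := 500 * time.Millisecond  // first durationBeforeRetry seen above
	const maxDelay = 2 * time.Minute // cap chosen arbitrarily for the sketch

	for attempt := 1; ; attempt++ {
		err := fetchSecret(attempt)
		if err == nil {
			fmt.Printf("attempt %d: MountVolume.SetUp succeeded\n", attempt)
			return
		}
		fmt.Printf("attempt %d failed (%v); no retries permitted for %v\n", attempt, err, delay)
		time.Sleep(delay)
		delay *= 2 // 500ms -> 1s -> 2s ..., matching the progression in the log
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```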
Need to start a new one" pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.438629 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22a24e17-6432-4f77-a553-47f9de4d68e4-cert\") pod \"controller-5bddd4b946-pgtqj\" (UID: \"22a24e17-6432-4f77-a553-47f9de4d68e4\") " pod="metallb-system/controller-5bddd4b946-pgtqj" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.438670 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22a24e17-6432-4f77-a553-47f9de4d68e4-metrics-certs\") pod \"controller-5bddd4b946-pgtqj\" (UID: \"22a24e17-6432-4f77-a553-47f9de4d68e4\") " pod="metallb-system/controller-5bddd4b946-pgtqj" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.438757 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mv88\" (UniqueName: \"kubernetes.io/projected/22a24e17-6432-4f77-a553-47f9de4d68e4-kube-api-access-8mv88\") pod \"controller-5bddd4b946-pgtqj\" (UID: \"22a24e17-6432-4f77-a553-47f9de4d68e4\") " pod="metallb-system/controller-5bddd4b946-pgtqj" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.440746 4740 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.447385 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22a24e17-6432-4f77-a553-47f9de4d68e4-metrics-certs\") pod \"controller-5bddd4b946-pgtqj\" (UID: \"22a24e17-6432-4f77-a553-47f9de4d68e4\") " pod="metallb-system/controller-5bddd4b946-pgtqj" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.455243 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22a24e17-6432-4f77-a553-47f9de4d68e4-cert\") pod \"controller-5bddd4b946-pgtqj\" (UID: \"22a24e17-6432-4f77-a553-47f9de4d68e4\") " pod="metallb-system/controller-5bddd4b946-pgtqj" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.457657 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mv88\" (UniqueName: \"kubernetes.io/projected/22a24e17-6432-4f77-a553-47f9de4d68e4-kube-api-access-8mv88\") pod \"controller-5bddd4b946-pgtqj\" (UID: \"22a24e17-6432-4f77-a553-47f9de4d68e4\") " pod="metallb-system/controller-5bddd4b946-pgtqj" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.521463 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bddd4b946-pgtqj" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.831730 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7"] Jan 05 14:03:49 crc kubenswrapper[4740]: W0105 14:03:49.840402 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafd48a06_614a_4fed_8629_a1a2eb83ab80.slice/crio-d77ab802913b64d6cc5e9410ccf95d7f494b5738380adfe54483b12b2a49c3b5 WatchSource:0}: Error finding container d77ab802913b64d6cc5e9410ccf95d7f494b5738380adfe54483b12b2a49c3b5: Status 404 returned error can't find the container with id d77ab802913b64d6cc5e9410ccf95d7f494b5738380adfe54483b12b2a49c3b5 Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.844476 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ba42b606-5ea8-4d51-a695-dc563937f304-memberlist\") pod \"speaker-vp8lm\" (UID: \"ba42b606-5ea8-4d51-a695-dc563937f304\") " pod="metallb-system/speaker-vp8lm" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.844638 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba42b606-5ea8-4d51-a695-dc563937f304-metrics-certs\") pod \"speaker-vp8lm\" (UID: \"ba42b606-5ea8-4d51-a695-dc563937f304\") " pod="metallb-system/speaker-vp8lm" Jan 05 14:03:49 crc kubenswrapper[4740]: E0105 14:03:49.844968 4740 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 05 14:03:49 crc kubenswrapper[4740]: E0105 14:03:49.845060 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba42b606-5ea8-4d51-a695-dc563937f304-memberlist podName:ba42b606-5ea8-4d51-a695-dc563937f304 nodeName:}" failed. No retries permitted until 2026-01-05 14:03:50.845038211 +0000 UTC m=+880.151946830 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ba42b606-5ea8-4d51-a695-dc563937f304-memberlist") pod "speaker-vp8lm" (UID: "ba42b606-5ea8-4d51-a695-dc563937f304") : secret "metallb-memberlist" not found Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.849969 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba42b606-5ea8-4d51-a695-dc563937f304-metrics-certs\") pod \"speaker-vp8lm\" (UID: \"ba42b606-5ea8-4d51-a695-dc563937f304\") " pod="metallb-system/speaker-vp8lm" Jan 05 14:03:49 crc kubenswrapper[4740]: I0105 14:03:49.921136 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-pgtqj"] Jan 05 14:03:50 crc kubenswrapper[4740]: I0105 14:03:50.359358 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" event={"ID":"afd48a06-614a-4fed-8629-a1a2eb83ab80","Type":"ContainerStarted","Data":"d77ab802913b64d6cc5e9410ccf95d7f494b5738380adfe54483b12b2a49c3b5"} Jan 05 14:03:50 crc kubenswrapper[4740]: I0105 14:03:50.360472 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ffh7k" event={"ID":"3c26b9da-cc4e-44dc-92ba-92e42b962010","Type":"ContainerStarted","Data":"4129743d0a9aac4c5ba2ec14b26cf91e9d53fa4ef38dc8443df571696b8b0620"} Jan 05 14:03:50 crc kubenswrapper[4740]: I0105 14:03:50.361991 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-pgtqj" event={"ID":"22a24e17-6432-4f77-a553-47f9de4d68e4","Type":"ContainerStarted","Data":"2647995415fa782d04feaf1cd87de92649bd5078574fdd6bde7bd95e6dec2312"} Jan 05 14:03:50 crc kubenswrapper[4740]: I0105 14:03:50.362014 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-pgtqj" event={"ID":"22a24e17-6432-4f77-a553-47f9de4d68e4","Type":"ContainerStarted","Data":"75caf9e297d57c480eecf8e52e912639d129ad8817b061dc0168cf2adce96ee5"} Jan 05 14:03:50 crc kubenswrapper[4740]: I0105 14:03:50.362024 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-pgtqj" event={"ID":"22a24e17-6432-4f77-a553-47f9de4d68e4","Type":"ContainerStarted","Data":"c1788143467e890bae9d19b494ee53cf54abe916eec81d7b08372814864f3e1e"} Jan 05 14:03:50 crc kubenswrapper[4740]: I0105 14:03:50.362220 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5bddd4b946-pgtqj" Jan 05 14:03:50 crc kubenswrapper[4740]: I0105 14:03:50.381375 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5bddd4b946-pgtqj" podStartSLOduration=1.381354228 podStartE2EDuration="1.381354228s" podCreationTimestamp="2026-01-05 14:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:03:50.375862132 +0000 UTC m=+879.682770711" watchObservedRunningTime="2026-01-05 14:03:50.381354228 +0000 UTC m=+879.688262817" Jan 05 14:03:50 crc kubenswrapper[4740]: I0105 14:03:50.859872 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ba42b606-5ea8-4d51-a695-dc563937f304-memberlist\") pod \"speaker-vp8lm\" (UID: \"ba42b606-5ea8-4d51-a695-dc563937f304\") " pod="metallb-system/speaker-vp8lm" Jan 05 14:03:50 crc kubenswrapper[4740]: I0105 14:03:50.870662 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ba42b606-5ea8-4d51-a695-dc563937f304-memberlist\") pod \"speaker-vp8lm\" (UID: \"ba42b606-5ea8-4d51-a695-dc563937f304\") " pod="metallb-system/speaker-vp8lm" Jan 05 14:03:51 crc kubenswrapper[4740]: I0105 14:03:51.003296 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-vp8lm" Jan 05 14:03:51 crc kubenswrapper[4740]: I0105 14:03:51.371482 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vp8lm" event={"ID":"ba42b606-5ea8-4d51-a695-dc563937f304","Type":"ContainerStarted","Data":"559933fa7c1028a0e0d97f3615dd044bf6321af1614324b5ed95c6782dd85c30"} Jan 05 14:03:51 crc kubenswrapper[4740]: I0105 14:03:51.371830 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vp8lm" event={"ID":"ba42b606-5ea8-4d51-a695-dc563937f304","Type":"ContainerStarted","Data":"826895091ccf7f3cac52d1dd5aa7dc1cf1a734bff52d9c2ed393bf4e567c6acf"} Jan 05 14:03:52 crc kubenswrapper[4740]: I0105 14:03:52.382007 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vp8lm" event={"ID":"ba42b606-5ea8-4d51-a695-dc563937f304","Type":"ContainerStarted","Data":"cf546dcdf2f5e97d4c2a9d94166c91cf697cd2e304cbaccc428a712b9cbcf98c"} Jan 05 14:03:52 crc kubenswrapper[4740]: I0105 14:03:52.382238 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-vp8lm" Jan 05 14:03:52 crc kubenswrapper[4740]: I0105 14:03:52.406617 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-vp8lm" podStartSLOduration=3.406598916 podStartE2EDuration="3.406598916s" podCreationTimestamp="2026-01-05 14:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:03:52.398022277 +0000 UTC m=+881.704930856" watchObservedRunningTime="2026-01-05 14:03:52.406598916 +0000 UTC m=+881.713507495" Jan 05 14:03:57 crc kubenswrapper[4740]: I0105 14:03:57.442235 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" event={"ID":"afd48a06-614a-4fed-8629-a1a2eb83ab80","Type":"ContainerStarted","Data":"49f06f29e33c5e348a2da149a499c53a1185432ff82fc3000fe402e4826d6fb3"} Jan 05 14:03:57 crc kubenswrapper[4740]: I0105 14:03:57.442830 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" Jan 05 14:03:57 crc kubenswrapper[4740]: I0105 14:03:57.446186 4740 generic.go:334] "Generic (PLEG): container finished" podID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerID="a7ddd18417d98ca7fcf024d7798f334e79f41de04d7faab937144ded267e2541" exitCode=0 Jan 05 14:03:57 crc kubenswrapper[4740]: I0105 14:03:57.446235 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ffh7k" event={"ID":"3c26b9da-cc4e-44dc-92ba-92e42b962010","Type":"ContainerDied","Data":"a7ddd18417d98ca7fcf024d7798f334e79f41de04d7faab937144ded267e2541"} Jan 05 14:03:57 crc kubenswrapper[4740]: I0105 14:03:57.496330 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" podStartSLOduration=1.651942241 podStartE2EDuration="8.49631186s" podCreationTimestamp="2026-01-05 14:03:49 +0000 UTC" firstStartedPulling="2026-01-05 14:03:49.843629434 +0000 UTC 
m=+879.150538013" lastFinishedPulling="2026-01-05 14:03:56.687999053 +0000 UTC m=+885.994907632" observedRunningTime="2026-01-05 14:03:57.472214406 +0000 UTC m=+886.779122995" watchObservedRunningTime="2026-01-05 14:03:57.49631186 +0000 UTC m=+886.803220429" Jan 05 14:03:58 crc kubenswrapper[4740]: I0105 14:03:58.462217 4740 generic.go:334] "Generic (PLEG): container finished" podID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerID="120a582cf5447c846d0f655102e59c34240c4f85da5f37c79f8ce2c52272f081" exitCode=0 Jan 05 14:03:58 crc kubenswrapper[4740]: I0105 14:03:58.462328 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ffh7k" event={"ID":"3c26b9da-cc4e-44dc-92ba-92e42b962010","Type":"ContainerDied","Data":"120a582cf5447c846d0f655102e59c34240c4f85da5f37c79f8ce2c52272f081"} Jan 05 14:03:59 crc kubenswrapper[4740]: I0105 14:03:59.487612 4740 generic.go:334] "Generic (PLEG): container finished" podID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerID="5e192df7c05e509b5510d8a2fb5484c1a04768d5afe22ba95528fd09fe42b71c" exitCode=0 Jan 05 14:03:59 crc kubenswrapper[4740]: I0105 14:03:59.487827 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ffh7k" event={"ID":"3c26b9da-cc4e-44dc-92ba-92e42b962010","Type":"ContainerDied","Data":"5e192df7c05e509b5510d8a2fb5484c1a04768d5afe22ba95528fd09fe42b71c"} Jan 05 14:04:00 crc kubenswrapper[4740]: I0105 14:04:00.504196 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ffh7k" event={"ID":"3c26b9da-cc4e-44dc-92ba-92e42b962010","Type":"ContainerStarted","Data":"d78242926e833037b58efaeb832020ffc87d460b388f386b2e09328fb1b04a5c"} Jan 05 14:04:00 crc kubenswrapper[4740]: I0105 14:04:00.504557 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ffh7k" event={"ID":"3c26b9da-cc4e-44dc-92ba-92e42b962010","Type":"ContainerStarted","Data":"d4c32c0a9c3dd4bd2432c590e6d077d569584a9171fe7eda0730d0ffc938d3bb"} Jan 05 14:04:00 crc kubenswrapper[4740]: I0105 14:04:00.504575 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ffh7k" event={"ID":"3c26b9da-cc4e-44dc-92ba-92e42b962010","Type":"ContainerStarted","Data":"e9cbc5b2b76dac75a477a645cfdc20161bb20a2f6ae0cf42bfb62042f7a39ab9"} Jan 05 14:04:00 crc kubenswrapper[4740]: I0105 14:04:00.504587 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ffh7k" event={"ID":"3c26b9da-cc4e-44dc-92ba-92e42b962010","Type":"ContainerStarted","Data":"4e2777c6c82a22306634e0bbdfeea2142ebd7ad9f5d84eb0bf62f4df4fc1f1cb"} Jan 05 14:04:01 crc kubenswrapper[4740]: I0105 14:04:01.007397 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-vp8lm" Jan 05 14:04:01 crc kubenswrapper[4740]: I0105 14:04:01.521142 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ffh7k" event={"ID":"3c26b9da-cc4e-44dc-92ba-92e42b962010","Type":"ContainerStarted","Data":"c6ff61894759a5aa3ffb28002a8e0bb3af8f71e005a665753139f2abea245706"} Jan 05 14:04:01 crc kubenswrapper[4740]: I0105 14:04:01.521209 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ffh7k" event={"ID":"3c26b9da-cc4e-44dc-92ba-92e42b962010","Type":"ContainerStarted","Data":"ad763bcdc0628278331ca0a746d88cdfffc8966e76d092253fe98da758e3c507"} Jan 05 14:04:01 crc kubenswrapper[4740]: I0105 14:04:01.522871 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-ffh7k" Jan 05 
14:04:01 crc kubenswrapper[4740]: I0105 14:04:01.547521 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-ffh7k" podStartSLOduration=5.390582099 podStartE2EDuration="12.547497374s" podCreationTimestamp="2026-01-05 14:03:49 +0000 UTC" firstStartedPulling="2026-01-05 14:03:49.564349338 +0000 UTC m=+878.871257917" lastFinishedPulling="2026-01-05 14:03:56.721264613 +0000 UTC m=+886.028173192" observedRunningTime="2026-01-05 14:04:01.544316978 +0000 UTC m=+890.851225587" watchObservedRunningTime="2026-01-05 14:04:01.547497374 +0000 UTC m=+890.854405993" Jan 05 14:04:04 crc kubenswrapper[4740]: I0105 14:04:04.147938 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-57228"] Jan 05 14:04:04 crc kubenswrapper[4740]: I0105 14:04:04.149813 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-57228" Jan 05 14:04:04 crc kubenswrapper[4740]: I0105 14:04:04.152408 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 05 14:04:04 crc kubenswrapper[4740]: I0105 14:04:04.152531 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 05 14:04:04 crc kubenswrapper[4740]: I0105 14:04:04.152426 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-htxtf" Jan 05 14:04:04 crc kubenswrapper[4740]: I0105 14:04:04.162803 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-57228"] Jan 05 14:04:04 crc kubenswrapper[4740]: I0105 14:04:04.226429 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsqx5\" (UniqueName: \"kubernetes.io/projected/07d25a3e-6936-461d-9760-e1e543fdb62c-kube-api-access-dsqx5\") pod \"openstack-operator-index-57228\" (UID: \"07d25a3e-6936-461d-9760-e1e543fdb62c\") " pod="openstack-operators/openstack-operator-index-57228" Jan 05 14:04:04 crc kubenswrapper[4740]: I0105 14:04:04.328593 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsqx5\" (UniqueName: \"kubernetes.io/projected/07d25a3e-6936-461d-9760-e1e543fdb62c-kube-api-access-dsqx5\") pod \"openstack-operator-index-57228\" (UID: \"07d25a3e-6936-461d-9760-e1e543fdb62c\") " pod="openstack-operators/openstack-operator-index-57228" Jan 05 14:04:04 crc kubenswrapper[4740]: I0105 14:04:04.349847 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsqx5\" (UniqueName: \"kubernetes.io/projected/07d25a3e-6936-461d-9760-e1e543fdb62c-kube-api-access-dsqx5\") pod \"openstack-operator-index-57228\" (UID: \"07d25a3e-6936-461d-9760-e1e543fdb62c\") " pod="openstack-operators/openstack-operator-index-57228" Jan 05 14:04:04 crc kubenswrapper[4740]: I0105 14:04:04.379353 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:04:04 crc kubenswrapper[4740]: I0105 14:04:04.415630 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:04:04 crc kubenswrapper[4740]: I0105 14:04:04.472127 4740 util.go:30] "No sandbox for pod can be found. 
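The pod_startup_latency_tracker entries are internally consistent and can be checked by hand: in these entries, podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration equals that E2E figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). For frr-k8s-ffh7k above: 14:04:01.547497374 − 14:03:49 = 12.547497374s, and 12.547497374s − (14:03:56.721264613 − 14:03:49.564349338) = 5.390582099s, matching the logged values. A short Go check of that arithmetic, with the timestamps copied from the entry (the relationship is inferred from these log entries, not quoted from kubelet documentation):

```go
package main

import (
	"fmt"
	"time"
)

// Layout matching the timestamps printed in the entry above.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func ts(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := ts("2026-01-05 14:03:49 +0000 UTC")
	firstPull := ts("2026-01-05 14:03:49.564349338 +0000 UTC")
	lastPull := ts("2026-01-05 14:03:56.721264613 +0000 UTC")
	observed := ts("2026-01-05 14:04:01.547497374 +0000 UTC")

	e2e := observed.Sub(created)    // logged as podStartE2EDuration=12.547497374s
	pull := lastPull.Sub(firstPull) // image-pull window
	slo := e2e - pull               // logged as podStartSLOduration=5.390582099s

	fmt.Println("E2E:", e2e, " pull:", pull, " SLO:", slo)
}
```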
Need to start a new one" pod="openstack-operators/openstack-operator-index-57228" Jan 05 14:04:04 crc kubenswrapper[4740]: I0105 14:04:04.919701 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-57228"] Jan 05 14:04:05 crc kubenswrapper[4740]: I0105 14:04:05.552198 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-57228" event={"ID":"07d25a3e-6936-461d-9760-e1e543fdb62c","Type":"ContainerStarted","Data":"dbfb555125c96c151973f5f1c174556443494472b4124bd7d04bc9520cd420c1"} Jan 05 14:04:06 crc kubenswrapper[4740]: I0105 14:04:06.727173 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-57228"] Jan 05 14:04:07 crc kubenswrapper[4740]: I0105 14:04:07.334053 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-xlq6v"] Jan 05 14:04:07 crc kubenswrapper[4740]: I0105 14:04:07.335015 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-xlq6v" Jan 05 14:04:07 crc kubenswrapper[4740]: I0105 14:04:07.353131 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xlq6v"] Jan 05 14:04:07 crc kubenswrapper[4740]: I0105 14:04:07.383958 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zjlr\" (UniqueName: \"kubernetes.io/projected/3a862b07-6296-43d8-8aff-2a6fdf1bd898-kube-api-access-9zjlr\") pod \"openstack-operator-index-xlq6v\" (UID: \"3a862b07-6296-43d8-8aff-2a6fdf1bd898\") " pod="openstack-operators/openstack-operator-index-xlq6v" Jan 05 14:04:07 crc kubenswrapper[4740]: I0105 14:04:07.485291 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zjlr\" (UniqueName: \"kubernetes.io/projected/3a862b07-6296-43d8-8aff-2a6fdf1bd898-kube-api-access-9zjlr\") pod \"openstack-operator-index-xlq6v\" (UID: \"3a862b07-6296-43d8-8aff-2a6fdf1bd898\") " pod="openstack-operators/openstack-operator-index-xlq6v" Jan 05 14:04:07 crc kubenswrapper[4740]: I0105 14:04:07.506243 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zjlr\" (UniqueName: \"kubernetes.io/projected/3a862b07-6296-43d8-8aff-2a6fdf1bd898-kube-api-access-9zjlr\") pod \"openstack-operator-index-xlq6v\" (UID: \"3a862b07-6296-43d8-8aff-2a6fdf1bd898\") " pod="openstack-operators/openstack-operator-index-xlq6v" Jan 05 14:04:07 crc kubenswrapper[4740]: I0105 14:04:07.656222 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xlq6v" Jan 05 14:04:08 crc kubenswrapper[4740]: I0105 14:04:08.137196 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xlq6v"] Jan 05 14:04:08 crc kubenswrapper[4740]: I0105 14:04:08.587399 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xlq6v" event={"ID":"3a862b07-6296-43d8-8aff-2a6fdf1bd898","Type":"ContainerStarted","Data":"d293b1f8cf8295731773ed4b75572b75d30b82c2b66a5eb2dff2c0744859165e"} Jan 05 14:04:08 crc kubenswrapper[4740]: I0105 14:04:08.587922 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xlq6v" event={"ID":"3a862b07-6296-43d8-8aff-2a6fdf1bd898","Type":"ContainerStarted","Data":"ba80222230777780657a6442355bb02bb1314aaf344693fdb3184b1432f78815"} Jan 05 14:04:08 crc kubenswrapper[4740]: I0105 14:04:08.589756 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-57228" event={"ID":"07d25a3e-6936-461d-9760-e1e543fdb62c","Type":"ContainerStarted","Data":"1a8da5fa2dc8216302e683eb0305dc66ab6bf3ff9886bfa2f9ceb33147dee387"} Jan 05 14:04:08 crc kubenswrapper[4740]: I0105 14:04:08.589909 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-57228" podUID="07d25a3e-6936-461d-9760-e1e543fdb62c" containerName="registry-server" containerID="cri-o://1a8da5fa2dc8216302e683eb0305dc66ab6bf3ff9886bfa2f9ceb33147dee387" gracePeriod=2 Jan 05 14:04:08 crc kubenswrapper[4740]: I0105 14:04:08.636773 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-xlq6v" podStartSLOduration=1.546623929 podStartE2EDuration="1.636755279s" podCreationTimestamp="2026-01-05 14:04:07 +0000 UTC" firstStartedPulling="2026-01-05 14:04:08.142437485 +0000 UTC m=+897.449346074" lastFinishedPulling="2026-01-05 14:04:08.232568835 +0000 UTC m=+897.539477424" observedRunningTime="2026-01-05 14:04:08.629229178 +0000 UTC m=+897.936137767" watchObservedRunningTime="2026-01-05 14:04:08.636755279 +0000 UTC m=+897.943663868" Jan 05 14:04:08 crc kubenswrapper[4740]: I0105 14:04:08.662122 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-57228" podStartSLOduration=2.140597294 podStartE2EDuration="4.662093256s" podCreationTimestamp="2026-01-05 14:04:04 +0000 UTC" firstStartedPulling="2026-01-05 14:04:04.938806768 +0000 UTC m=+894.245715347" lastFinishedPulling="2026-01-05 14:04:07.46030273 +0000 UTC m=+896.767211309" observedRunningTime="2026-01-05 14:04:08.647837185 +0000 UTC m=+897.954745804" watchObservedRunningTime="2026-01-05 14:04:08.662093256 +0000 UTC m=+897.969001875" Jan 05 14:04:09 crc kubenswrapper[4740]: I0105 14:04:09.110833 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-57228" Jan 05 14:04:09 crc kubenswrapper[4740]: I0105 14:04:09.127825 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsqx5\" (UniqueName: \"kubernetes.io/projected/07d25a3e-6936-461d-9760-e1e543fdb62c-kube-api-access-dsqx5\") pod \"07d25a3e-6936-461d-9760-e1e543fdb62c\" (UID: \"07d25a3e-6936-461d-9760-e1e543fdb62c\") " Jan 05 14:04:09 crc kubenswrapper[4740]: I0105 14:04:09.133150 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d25a3e-6936-461d-9760-e1e543fdb62c-kube-api-access-dsqx5" (OuterVolumeSpecName: "kube-api-access-dsqx5") pod "07d25a3e-6936-461d-9760-e1e543fdb62c" (UID: "07d25a3e-6936-461d-9760-e1e543fdb62c"). InnerVolumeSpecName "kube-api-access-dsqx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:04:09 crc kubenswrapper[4740]: I0105 14:04:09.229690 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsqx5\" (UniqueName: \"kubernetes.io/projected/07d25a3e-6936-461d-9760-e1e543fdb62c-kube-api-access-dsqx5\") on node \"crc\" DevicePath \"\"" Jan 05 14:04:09 crc kubenswrapper[4740]: I0105 14:04:09.380139 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" Jan 05 14:04:09 crc kubenswrapper[4740]: I0105 14:04:09.380774 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-ffh7k" Jan 05 14:04:09 crc kubenswrapper[4740]: I0105 14:04:09.526331 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5bddd4b946-pgtqj" Jan 05 14:04:09 crc kubenswrapper[4740]: I0105 14:04:09.600817 4740 generic.go:334] "Generic (PLEG): container finished" podID="07d25a3e-6936-461d-9760-e1e543fdb62c" containerID="1a8da5fa2dc8216302e683eb0305dc66ab6bf3ff9886bfa2f9ceb33147dee387" exitCode=0 Jan 05 14:04:09 crc kubenswrapper[4740]: I0105 14:04:09.600909 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-57228" Jan 05 14:04:09 crc kubenswrapper[4740]: I0105 14:04:09.600968 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-57228" event={"ID":"07d25a3e-6936-461d-9760-e1e543fdb62c","Type":"ContainerDied","Data":"1a8da5fa2dc8216302e683eb0305dc66ab6bf3ff9886bfa2f9ceb33147dee387"} Jan 05 14:04:09 crc kubenswrapper[4740]: I0105 14:04:09.601008 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-57228" event={"ID":"07d25a3e-6936-461d-9760-e1e543fdb62c","Type":"ContainerDied","Data":"dbfb555125c96c151973f5f1c174556443494472b4124bd7d04bc9520cd420c1"} Jan 05 14:04:09 crc kubenswrapper[4740]: I0105 14:04:09.601037 4740 scope.go:117] "RemoveContainer" containerID="1a8da5fa2dc8216302e683eb0305dc66ab6bf3ff9886bfa2f9ceb33147dee387" Jan 05 14:04:09 crc kubenswrapper[4740]: I0105 14:04:09.630402 4740 scope.go:117] "RemoveContainer" containerID="1a8da5fa2dc8216302e683eb0305dc66ab6bf3ff9886bfa2f9ceb33147dee387" Jan 05 14:04:09 crc kubenswrapper[4740]: E0105 14:04:09.632266 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a8da5fa2dc8216302e683eb0305dc66ab6bf3ff9886bfa2f9ceb33147dee387\": container with ID starting with 1a8da5fa2dc8216302e683eb0305dc66ab6bf3ff9886bfa2f9ceb33147dee387 not found: ID does not exist" containerID="1a8da5fa2dc8216302e683eb0305dc66ab6bf3ff9886bfa2f9ceb33147dee387" Jan 05 14:04:09 crc kubenswrapper[4740]: I0105 14:04:09.632308 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a8da5fa2dc8216302e683eb0305dc66ab6bf3ff9886bfa2f9ceb33147dee387"} err="failed to get container status \"1a8da5fa2dc8216302e683eb0305dc66ab6bf3ff9886bfa2f9ceb33147dee387\": rpc error: code = NotFound desc = could not find container \"1a8da5fa2dc8216302e683eb0305dc66ab6bf3ff9886bfa2f9ceb33147dee387\": container with ID starting with 1a8da5fa2dc8216302e683eb0305dc66ab6bf3ff9886bfa2f9ceb33147dee387 not found: ID does not exist" Jan 05 14:04:09 crc kubenswrapper[4740]: I0105 14:04:09.634408 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-57228"] Jan 05 14:04:09 crc kubenswrapper[4740]: I0105 14:04:09.639814 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-57228"] Jan 05 14:04:10 crc kubenswrapper[4740]: I0105 14:04:10.977930 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d25a3e-6936-461d-9760-e1e543fdb62c" path="/var/lib/kubelet/pods/07d25a3e-6936-461d-9760-e1e543fdb62c/volumes" Jan 05 14:04:17 crc kubenswrapper[4740]: I0105 14:04:17.658136 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-xlq6v" Jan 05 14:04:17 crc kubenswrapper[4740]: I0105 14:04:17.658703 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-xlq6v" Jan 05 14:04:17 crc kubenswrapper[4740]: I0105 14:04:17.706357 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-xlq6v" Jan 05 14:04:17 crc kubenswrapper[4740]: I0105 14:04:17.739756 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-xlq6v" Jan 05 14:04:23 crc 
kubenswrapper[4740]: I0105 14:04:23.766602 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw"] Jan 05 14:04:23 crc kubenswrapper[4740]: E0105 14:04:23.768827 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d25a3e-6936-461d-9760-e1e543fdb62c" containerName="registry-server" Jan 05 14:04:23 crc kubenswrapper[4740]: I0105 14:04:23.768917 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d25a3e-6936-461d-9760-e1e543fdb62c" containerName="registry-server" Jan 05 14:04:23 crc kubenswrapper[4740]: I0105 14:04:23.769164 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="07d25a3e-6936-461d-9760-e1e543fdb62c" containerName="registry-server" Jan 05 14:04:23 crc kubenswrapper[4740]: I0105 14:04:23.770404 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" Jan 05 14:04:23 crc kubenswrapper[4740]: I0105 14:04:23.771933 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-jjphd" Jan 05 14:04:23 crc kubenswrapper[4740]: I0105 14:04:23.796641 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw"] Jan 05 14:04:23 crc kubenswrapper[4740]: I0105 14:04:23.940609 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-bundle\") pod \"10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw\" (UID: \"5f093e0b-c437-4ab1-8e0d-ff7408f72e14\") " pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" Jan 05 14:04:23 crc kubenswrapper[4740]: I0105 14:04:23.940661 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-util\") pod \"10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw\" (UID: \"5f093e0b-c437-4ab1-8e0d-ff7408f72e14\") " pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" Jan 05 14:04:23 crc kubenswrapper[4740]: I0105 14:04:23.940704 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bptsc\" (UniqueName: \"kubernetes.io/projected/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-kube-api-access-bptsc\") pod \"10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw\" (UID: \"5f093e0b-c437-4ab1-8e0d-ff7408f72e14\") " pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" Jan 05 14:04:24 crc kubenswrapper[4740]: I0105 14:04:24.042517 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bptsc\" (UniqueName: \"kubernetes.io/projected/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-kube-api-access-bptsc\") pod \"10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw\" (UID: \"5f093e0b-c437-4ab1-8e0d-ff7408f72e14\") " pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" Jan 05 14:04:24 crc kubenswrapper[4740]: I0105 14:04:24.042948 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-bundle\") pod \"10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw\" (UID: \"5f093e0b-c437-4ab1-8e0d-ff7408f72e14\") " pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" Jan 05 14:04:24 crc kubenswrapper[4740]: I0105 14:04:24.043108 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-util\") pod \"10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw\" (UID: \"5f093e0b-c437-4ab1-8e0d-ff7408f72e14\") " pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" Jan 05 14:04:24 crc kubenswrapper[4740]: I0105 14:04:24.043521 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-bundle\") pod \"10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw\" (UID: \"5f093e0b-c437-4ab1-8e0d-ff7408f72e14\") " pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" Jan 05 14:04:24 crc kubenswrapper[4740]: I0105 14:04:24.043569 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-util\") pod \"10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw\" (UID: \"5f093e0b-c437-4ab1-8e0d-ff7408f72e14\") " pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" Jan 05 14:04:24 crc kubenswrapper[4740]: I0105 14:04:24.069700 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bptsc\" (UniqueName: \"kubernetes.io/projected/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-kube-api-access-bptsc\") pod \"10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw\" (UID: \"5f093e0b-c437-4ab1-8e0d-ff7408f72e14\") " pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" Jan 05 14:04:24 crc kubenswrapper[4740]: I0105 14:04:24.090169 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" Jan 05 14:04:24 crc kubenswrapper[4740]: I0105 14:04:24.513865 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw"] Jan 05 14:04:24 crc kubenswrapper[4740]: W0105 14:04:24.526723 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f093e0b_c437_4ab1_8e0d_ff7408f72e14.slice/crio-228dc36743143df41521211770567cf2ddd8e3a1b8c2e14914162a79cf184d08 WatchSource:0}: Error finding container 228dc36743143df41521211770567cf2ddd8e3a1b8c2e14914162a79cf184d08: Status 404 returned error can't find the container with id 228dc36743143df41521211770567cf2ddd8e3a1b8c2e14914162a79cf184d08 Jan 05 14:04:25 crc kubenswrapper[4740]: I0105 14:04:25.265768 4740 generic.go:334] "Generic (PLEG): container finished" podID="5f093e0b-c437-4ab1-8e0d-ff7408f72e14" containerID="1a3a33f5e56835f47465dd854806292b3afeb4d179502e0fd46316799c81061a" exitCode=0 Jan 05 14:04:25 crc kubenswrapper[4740]: I0105 14:04:25.265855 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" event={"ID":"5f093e0b-c437-4ab1-8e0d-ff7408f72e14","Type":"ContainerDied","Data":"1a3a33f5e56835f47465dd854806292b3afeb4d179502e0fd46316799c81061a"} Jan 05 14:04:25 crc kubenswrapper[4740]: I0105 14:04:25.266140 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" event={"ID":"5f093e0b-c437-4ab1-8e0d-ff7408f72e14","Type":"ContainerStarted","Data":"228dc36743143df41521211770567cf2ddd8e3a1b8c2e14914162a79cf184d08"} Jan 05 14:04:26 crc kubenswrapper[4740]: I0105 14:04:26.276265 4740 generic.go:334] "Generic (PLEG): container finished" podID="5f093e0b-c437-4ab1-8e0d-ff7408f72e14" containerID="4f47d35333184793ba7c3030aa7524e264da0a3e741a45c5a28435c2fa601456" exitCode=0 Jan 05 14:04:26 crc kubenswrapper[4740]: I0105 14:04:26.276310 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" event={"ID":"5f093e0b-c437-4ab1-8e0d-ff7408f72e14","Type":"ContainerDied","Data":"4f47d35333184793ba7c3030aa7524e264da0a3e741a45c5a28435c2fa601456"} Jan 05 14:04:27 crc kubenswrapper[4740]: I0105 14:04:27.291990 4740 generic.go:334] "Generic (PLEG): container finished" podID="5f093e0b-c437-4ab1-8e0d-ff7408f72e14" containerID="faff275d9a05a91a0e59c07e7c8ae8f8e62eb5f542576591cd5c958d8974b38b" exitCode=0 Jan 05 14:04:27 crc kubenswrapper[4740]: I0105 14:04:27.292192 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" event={"ID":"5f093e0b-c437-4ab1-8e0d-ff7408f72e14","Type":"ContainerDied","Data":"faff275d9a05a91a0e59c07e7c8ae8f8e62eb5f542576591cd5c958d8974b38b"} Jan 05 14:04:28 crc kubenswrapper[4740]: I0105 14:04:28.716456 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" Jan 05 14:04:28 crc kubenswrapper[4740]: I0105 14:04:28.865399 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bptsc\" (UniqueName: \"kubernetes.io/projected/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-kube-api-access-bptsc\") pod \"5f093e0b-c437-4ab1-8e0d-ff7408f72e14\" (UID: \"5f093e0b-c437-4ab1-8e0d-ff7408f72e14\") " Jan 05 14:04:28 crc kubenswrapper[4740]: I0105 14:04:28.865497 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-bundle\") pod \"5f093e0b-c437-4ab1-8e0d-ff7408f72e14\" (UID: \"5f093e0b-c437-4ab1-8e0d-ff7408f72e14\") " Jan 05 14:04:28 crc kubenswrapper[4740]: I0105 14:04:28.865601 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-util\") pod \"5f093e0b-c437-4ab1-8e0d-ff7408f72e14\" (UID: \"5f093e0b-c437-4ab1-8e0d-ff7408f72e14\") " Jan 05 14:04:28 crc kubenswrapper[4740]: I0105 14:04:28.866094 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-bundle" (OuterVolumeSpecName: "bundle") pod "5f093e0b-c437-4ab1-8e0d-ff7408f72e14" (UID: "5f093e0b-c437-4ab1-8e0d-ff7408f72e14"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:04:28 crc kubenswrapper[4740]: I0105 14:04:28.878145 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-util" (OuterVolumeSpecName: "util") pod "5f093e0b-c437-4ab1-8e0d-ff7408f72e14" (UID: "5f093e0b-c437-4ab1-8e0d-ff7408f72e14"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:04:28 crc kubenswrapper[4740]: I0105 14:04:28.885359 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-kube-api-access-bptsc" (OuterVolumeSpecName: "kube-api-access-bptsc") pod "5f093e0b-c437-4ab1-8e0d-ff7408f72e14" (UID: "5f093e0b-c437-4ab1-8e0d-ff7408f72e14"). InnerVolumeSpecName "kube-api-access-bptsc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:04:28 crc kubenswrapper[4740]: I0105 14:04:28.967861 4740 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-util\") on node \"crc\" DevicePath \"\"" Jan 05 14:04:28 crc kubenswrapper[4740]: I0105 14:04:28.967906 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bptsc\" (UniqueName: \"kubernetes.io/projected/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-kube-api-access-bptsc\") on node \"crc\" DevicePath \"\"" Jan 05 14:04:28 crc kubenswrapper[4740]: I0105 14:04:28.967923 4740 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f093e0b-c437-4ab1-8e0d-ff7408f72e14-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:04:29 crc kubenswrapper[4740]: I0105 14:04:29.313578 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" event={"ID":"5f093e0b-c437-4ab1-8e0d-ff7408f72e14","Type":"ContainerDied","Data":"228dc36743143df41521211770567cf2ddd8e3a1b8c2e14914162a79cf184d08"} Jan 05 14:04:29 crc kubenswrapper[4740]: I0105 14:04:29.313639 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="228dc36743143df41521211770567cf2ddd8e3a1b8c2e14914162a79cf184d08" Jan 05 14:04:29 crc kubenswrapper[4740]: I0105 14:04:29.313699 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw" Jan 05 14:04:36 crc kubenswrapper[4740]: I0105 14:04:36.471692 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6d46c7d5f9-sm4zx"] Jan 05 14:04:36 crc kubenswrapper[4740]: E0105 14:04:36.473230 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f093e0b-c437-4ab1-8e0d-ff7408f72e14" containerName="extract" Jan 05 14:04:36 crc kubenswrapper[4740]: I0105 14:04:36.473324 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f093e0b-c437-4ab1-8e0d-ff7408f72e14" containerName="extract" Jan 05 14:04:36 crc kubenswrapper[4740]: E0105 14:04:36.478719 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f093e0b-c437-4ab1-8e0d-ff7408f72e14" containerName="pull" Jan 05 14:04:36 crc kubenswrapper[4740]: I0105 14:04:36.478892 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f093e0b-c437-4ab1-8e0d-ff7408f72e14" containerName="pull" Jan 05 14:04:36 crc kubenswrapper[4740]: E0105 14:04:36.478966 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f093e0b-c437-4ab1-8e0d-ff7408f72e14" containerName="util" Jan 05 14:04:36 crc kubenswrapper[4740]: I0105 14:04:36.479025 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f093e0b-c437-4ab1-8e0d-ff7408f72e14" containerName="util" Jan 05 14:04:36 crc kubenswrapper[4740]: I0105 14:04:36.479353 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f093e0b-c437-4ab1-8e0d-ff7408f72e14" containerName="extract" Jan 05 14:04:36 crc kubenswrapper[4740]: I0105 14:04:36.479987 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6d46c7d5f9-sm4zx" Jan 05 14:04:36 crc kubenswrapper[4740]: I0105 14:04:36.487109 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-wt2cd" Jan 05 14:04:36 crc kubenswrapper[4740]: I0105 14:04:36.509880 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7m2d\" (UniqueName: \"kubernetes.io/projected/97dce6b2-fc01-4ced-a77a-a506dcb06eff-kube-api-access-s7m2d\") pod \"openstack-operator-controller-operator-6d46c7d5f9-sm4zx\" (UID: \"97dce6b2-fc01-4ced-a77a-a506dcb06eff\") " pod="openstack-operators/openstack-operator-controller-operator-6d46c7d5f9-sm4zx" Jan 05 14:04:36 crc kubenswrapper[4740]: I0105 14:04:36.514504 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6d46c7d5f9-sm4zx"] Jan 05 14:04:36 crc kubenswrapper[4740]: I0105 14:04:36.611162 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7m2d\" (UniqueName: \"kubernetes.io/projected/97dce6b2-fc01-4ced-a77a-a506dcb06eff-kube-api-access-s7m2d\") pod \"openstack-operator-controller-operator-6d46c7d5f9-sm4zx\" (UID: \"97dce6b2-fc01-4ced-a77a-a506dcb06eff\") " pod="openstack-operators/openstack-operator-controller-operator-6d46c7d5f9-sm4zx" Jan 05 14:04:36 crc kubenswrapper[4740]: I0105 14:04:36.631162 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7m2d\" (UniqueName: \"kubernetes.io/projected/97dce6b2-fc01-4ced-a77a-a506dcb06eff-kube-api-access-s7m2d\") pod \"openstack-operator-controller-operator-6d46c7d5f9-sm4zx\" (UID: \"97dce6b2-fc01-4ced-a77a-a506dcb06eff\") " pod="openstack-operators/openstack-operator-controller-operator-6d46c7d5f9-sm4zx" Jan 05 14:04:36 crc kubenswrapper[4740]: I0105 14:04:36.798393 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6d46c7d5f9-sm4zx" Jan 05 14:04:37 crc kubenswrapper[4740]: I0105 14:04:37.247192 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6d46c7d5f9-sm4zx"] Jan 05 14:04:37 crc kubenswrapper[4740]: W0105 14:04:37.259793 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97dce6b2_fc01_4ced_a77a_a506dcb06eff.slice/crio-bad9c56b0970f347ab851e280fe0f85614a97648198230862e1ec4d76f03b71f WatchSource:0}: Error finding container bad9c56b0970f347ab851e280fe0f85614a97648198230862e1ec4d76f03b71f: Status 404 returned error can't find the container with id bad9c56b0970f347ab851e280fe0f85614a97648198230862e1ec4d76f03b71f Jan 05 14:04:37 crc kubenswrapper[4740]: I0105 14:04:37.401364 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6d46c7d5f9-sm4zx" event={"ID":"97dce6b2-fc01-4ced-a77a-a506dcb06eff","Type":"ContainerStarted","Data":"bad9c56b0970f347ab851e280fe0f85614a97648198230862e1ec4d76f03b71f"} Jan 05 14:04:42 crc kubenswrapper[4740]: I0105 14:04:42.447132 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6d46c7d5f9-sm4zx" event={"ID":"97dce6b2-fc01-4ced-a77a-a506dcb06eff","Type":"ContainerStarted","Data":"87bdf933f30e2bf36501bb96d65829581584abcf7ac2fc3a979b2f57ece1f0dc"} Jan 05 14:04:42 crc kubenswrapper[4740]: I0105 14:04:42.447706 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6d46c7d5f9-sm4zx" Jan 05 14:04:42 crc kubenswrapper[4740]: I0105 14:04:42.483953 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6d46c7d5f9-sm4zx" podStartSLOduration=1.7091833950000002 podStartE2EDuration="6.483923289s" podCreationTimestamp="2026-01-05 14:04:36 +0000 UTC" firstStartedPulling="2026-01-05 14:04:37.265031032 +0000 UTC m=+926.571939621" lastFinishedPulling="2026-01-05 14:04:42.039770936 +0000 UTC m=+931.346679515" observedRunningTime="2026-01-05 14:04:42.477674872 +0000 UTC m=+931.784583501" watchObservedRunningTime="2026-01-05 14:04:42.483923289 +0000 UTC m=+931.790831878" Jan 05 14:04:56 crc kubenswrapper[4740]: I0105 14:04:56.802272 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6d46c7d5f9-sm4zx" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.232419 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.233756 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.239838 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-78979fc445-5dgxw"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.240780 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-5dgxw" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.243531 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-7sc8k" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.250640 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.252713 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-ljqwp" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.258744 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-78979fc445-5dgxw"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.271215 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-hfcj9"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.272274 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-hfcj9" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.275144 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-lxspj" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.295880 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-hfcj9"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.324369 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.337383 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.341183 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-tj5nk" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.372718 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7467q\" (UniqueName: \"kubernetes.io/projected/a07332cc-11af-4d3a-8761-891417586bd1-kube-api-access-7467q\") pod \"cinder-operator-controller-manager-78979fc445-5dgxw\" (UID: \"a07332cc-11af-4d3a-8761-891417586bd1\") " pod="openstack-operators/cinder-operator-controller-manager-78979fc445-5dgxw" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.374074 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtmln\" (UniqueName: \"kubernetes.io/projected/3868391b-95fe-40be-a77d-593ea72fd786-kube-api-access-gtmln\") pod \"barbican-operator-controller-manager-f6f74d6db-jqccs\" (UID: \"3868391b-95fe-40be-a77d-593ea72fd786\") " pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.374253 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcrgl\" (UniqueName: \"kubernetes.io/projected/01f58b56-275e-432c-aecc-f9853194f0fd-kube-api-access-fcrgl\") pod \"designate-operator-controller-manager-66f8b87655-hfcj9\" (UID: \"01f58b56-275e-432c-aecc-f9853194f0fd\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-hfcj9" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.403391 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.431961 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.454703 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.468483 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-mgwdk" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.493283 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.494221 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcrgl\" (UniqueName: \"kubernetes.io/projected/01f58b56-275e-432c-aecc-f9853194f0fd-kube-api-access-fcrgl\") pod \"designate-operator-controller-manager-66f8b87655-hfcj9\" (UID: \"01f58b56-275e-432c-aecc-f9853194f0fd\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-hfcj9" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.494278 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvnnb\" (UniqueName: \"kubernetes.io/projected/d8ffad98-ed22-4c4c-b0b8-234c3358089e-kube-api-access-hvnnb\") pod \"glance-operator-controller-manager-7b549fc966-7jlfq\" (UID: \"d8ffad98-ed22-4c4c-b0b8-234c3358089e\") " pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.494329 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7467q\" (UniqueName: \"kubernetes.io/projected/a07332cc-11af-4d3a-8761-891417586bd1-kube-api-access-7467q\") pod \"cinder-operator-controller-manager-78979fc445-5dgxw\" (UID: \"a07332cc-11af-4d3a-8761-891417586bd1\") " pod="openstack-operators/cinder-operator-controller-manager-78979fc445-5dgxw" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.494360 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtmln\" (UniqueName: \"kubernetes.io/projected/3868391b-95fe-40be-a77d-593ea72fd786-kube-api-access-gtmln\") pod \"barbican-operator-controller-manager-f6f74d6db-jqccs\" (UID: \"3868391b-95fe-40be-a77d-593ea72fd786\") " pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.507120 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-skl95"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.508230 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-skl95" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.525157 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-skl95"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.525672 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-rzklw" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.541174 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.542618 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.545456 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7467q\" (UniqueName: \"kubernetes.io/projected/a07332cc-11af-4d3a-8761-891417586bd1-kube-api-access-7467q\") pod \"cinder-operator-controller-manager-78979fc445-5dgxw\" (UID: \"a07332cc-11af-4d3a-8761-891417586bd1\") " pod="openstack-operators/cinder-operator-controller-manager-78979fc445-5dgxw" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.546725 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcrgl\" (UniqueName: \"kubernetes.io/projected/01f58b56-275e-432c-aecc-f9853194f0fd-kube-api-access-fcrgl\") pod \"designate-operator-controller-manager-66f8b87655-hfcj9\" (UID: \"01f58b56-275e-432c-aecc-f9853194f0fd\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-hfcj9" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.557997 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-pkrdr" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.563725 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtmln\" (UniqueName: \"kubernetes.io/projected/3868391b-95fe-40be-a77d-593ea72fd786-kube-api-access-gtmln\") pod \"barbican-operator-controller-manager-f6f74d6db-jqccs\" (UID: \"3868391b-95fe-40be-a77d-593ea72fd786\") " pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.571251 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-5dgxw" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.574573 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.575846 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.581129 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.585458 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-lplp4" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.602636 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.602830 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvnnb\" (UniqueName: \"kubernetes.io/projected/d8ffad98-ed22-4c4c-b0b8-234c3358089e-kube-api-access-hvnnb\") pod \"glance-operator-controller-manager-7b549fc966-7jlfq\" (UID: \"d8ffad98-ed22-4c4c-b0b8-234c3358089e\") " pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.602882 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn8td\" (UniqueName: \"kubernetes.io/projected/cb54fddc-8710-4066-908b-bb7a00a15c7e-kube-api-access-cn8td\") pod \"ironic-operator-controller-manager-f99f54bc8-9s2d8\" (UID: \"cb54fddc-8710-4066-908b-bb7a00a15c7e\") " pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.602914 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4vlk\" (UniqueName: \"kubernetes.io/projected/e2dc84c3-c204-4f17-bcf3-418ab17b873d-kube-api-access-q4vlk\") pod \"heat-operator-controller-manager-658dd65b86-lz62l\" (UID: \"e2dc84c3-c204-4f17-bcf3-418ab17b873d\") " pod="openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.602940 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqt9q\" (UniqueName: \"kubernetes.io/projected/61306334-5c80-4b48-8c47-bbc9a26f5ef3-kube-api-access-fqt9q\") pod \"horizon-operator-controller-manager-7f5ddd8d7b-skl95\" (UID: \"61306334-5c80-4b48-8c47-bbc9a26f5ef3\") " pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-skl95" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.602971 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert\") pod \"infra-operator-controller-manager-6d99759cf-mglsd\" (UID: \"3f0a6bbe-32b3-4e4d-afef-32e871616c6d\") " pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.602993 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndbj2\" (UniqueName: \"kubernetes.io/projected/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-kube-api-access-ndbj2\") pod \"infra-operator-controller-manager-6d99759cf-mglsd\" (UID: \"3f0a6bbe-32b3-4e4d-afef-32e871616c6d\") " pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.603176 
4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-hfcj9" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.603604 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.615846 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-p49cd" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.624700 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.632953 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.641898 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-598945d5b8-bn857"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.643008 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-bn857" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.644919 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-w44sm" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.652858 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvnnb\" (UniqueName: \"kubernetes.io/projected/d8ffad98-ed22-4c4c-b0b8-234c3358089e-kube-api-access-hvnnb\") pod \"glance-operator-controller-manager-7b549fc966-7jlfq\" (UID: \"d8ffad98-ed22-4c4c-b0b8-234c3358089e\") " pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.666309 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b88bfc995-vqb8k"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.667631 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-vqb8k" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.670490 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-ggtkh" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.678774 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.690633 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.700448 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-598945d5b8-bn857"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.706270 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mp4n\" (UniqueName: \"kubernetes.io/projected/d4245835-8bf3-4491-9e66-a456d2fea83d-kube-api-access-8mp4n\") pod \"keystone-operator-controller-manager-568985c78-7sdkl\" (UID: \"d4245835-8bf3-4491-9e66-a456d2fea83d\") " pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.706435 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn8td\" (UniqueName: \"kubernetes.io/projected/cb54fddc-8710-4066-908b-bb7a00a15c7e-kube-api-access-cn8td\") pod \"ironic-operator-controller-manager-f99f54bc8-9s2d8\" (UID: \"cb54fddc-8710-4066-908b-bb7a00a15c7e\") " pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.706458 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr8n7\" (UniqueName: \"kubernetes.io/projected/ca259c15-4c6d-4142-b257-12e805385d3f-kube-api-access-zr8n7\") pod \"manila-operator-controller-manager-598945d5b8-bn857\" (UID: \"ca259c15-4c6d-4142-b257-12e805385d3f\") " pod="openstack-operators/manila-operator-controller-manager-598945d5b8-bn857" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.706616 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4vlk\" (UniqueName: \"kubernetes.io/projected/e2dc84c3-c204-4f17-bcf3-418ab17b873d-kube-api-access-q4vlk\") pod \"heat-operator-controller-manager-658dd65b86-lz62l\" (UID: \"e2dc84c3-c204-4f17-bcf3-418ab17b873d\") " pod="openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.712282 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqt9q\" (UniqueName: \"kubernetes.io/projected/61306334-5c80-4b48-8c47-bbc9a26f5ef3-kube-api-access-fqt9q\") pod \"horizon-operator-controller-manager-7f5ddd8d7b-skl95\" (UID: \"61306334-5c80-4b48-8c47-bbc9a26f5ef3\") " pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-skl95" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.714400 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b88bfc995-vqb8k"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.714542 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert\") pod \"infra-operator-controller-manager-6d99759cf-mglsd\" (UID: \"3f0a6bbe-32b3-4e4d-afef-32e871616c6d\") " pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" Jan 05 14:05:16 crc kubenswrapper[4740]: E0105 14:05:16.714728 4740 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 05 14:05:16 crc kubenswrapper[4740]: E0105 14:05:16.714774 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert podName:3f0a6bbe-32b3-4e4d-afef-32e871616c6d nodeName:}" failed. No retries permitted until 2026-01-05 14:05:17.214756205 +0000 UTC m=+966.521664784 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert") pod "infra-operator-controller-manager-6d99759cf-mglsd" (UID: "3f0a6bbe-32b3-4e4d-afef-32e871616c6d") : secret "infra-operator-webhook-server-cert" not found Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.715479 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndbj2\" (UniqueName: \"kubernetes.io/projected/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-kube-api-access-ndbj2\") pod \"infra-operator-controller-manager-6d99759cf-mglsd\" (UID: \"3f0a6bbe-32b3-4e4d-afef-32e871616c6d\") " pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.716090 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm455\" (UniqueName: \"kubernetes.io/projected/b73e8f21-70b9-4f4a-b96b-f255e80db992-kube-api-access-bm455\") pod \"mariadb-operator-controller-manager-7b88bfc995-vqb8k\" (UID: \"b73e8f21-70b9-4f4a-b96b-f255e80db992\") " pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-vqb8k" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.728285 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-szhmd"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.730042 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-szhmd" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.732353 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-4qsrp" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.743299 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-szhmd"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.748017 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndbj2\" (UniqueName: \"kubernetes.io/projected/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-kube-api-access-ndbj2\") pod \"infra-operator-controller-manager-6d99759cf-mglsd\" (UID: \"3f0a6bbe-32b3-4e4d-afef-32e871616c6d\") " pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.748594 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4vlk\" (UniqueName: \"kubernetes.io/projected/e2dc84c3-c204-4f17-bcf3-418ab17b873d-kube-api-access-q4vlk\") pod \"heat-operator-controller-manager-658dd65b86-lz62l\" (UID: \"e2dc84c3-c204-4f17-bcf3-418ab17b873d\") " pod="openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.751697 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqt9q\" (UniqueName: \"kubernetes.io/projected/61306334-5c80-4b48-8c47-bbc9a26f5ef3-kube-api-access-fqt9q\") pod \"horizon-operator-controller-manager-7f5ddd8d7b-skl95\" (UID: \"61306334-5c80-4b48-8c47-bbc9a26f5ef3\") " pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-skl95" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.754412 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.755747 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.758393 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-ksldt" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.763981 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.764939 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.770518 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-rvmjb" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.773970 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn8td\" (UniqueName: \"kubernetes.io/projected/cb54fddc-8710-4066-908b-bb7a00a15c7e-kube-api-access-cn8td\") pod \"ironic-operator-controller-manager-f99f54bc8-9s2d8\" (UID: \"cb54fddc-8710-4066-908b-bb7a00a15c7e\") " pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.774020 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.779965 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.786333 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.793568 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-8qngq"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.795466 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-8qngq" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.799748 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-r8d8x" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.801868 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.804195 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.807158 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-brvgb" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.808326 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-8qngq"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.808387 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.816753 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.817430 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp558\" (UniqueName: \"kubernetes.io/projected/da2bbca8-6d81-4fd9-b2a6-e52f98b7fb29-kube-api-access-qp558\") pod \"octavia-operator-controller-manager-68c649d9d-8tvrz\" (UID: \"da2bbca8-6d81-4fd9-b2a6-e52f98b7fb29\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.817467 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert\") pod \"openstack-baremetal-operator-controller-manager-78948ddfd7m88wc\" (UID: \"6faf05ee-49e0-4d3e-afcd-d11d9494da44\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.817504 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cprtl\" (UniqueName: \"kubernetes.io/projected/6faf05ee-49e0-4d3e-afcd-d11d9494da44-kube-api-access-cprtl\") pod \"openstack-baremetal-operator-controller-manager-78948ddfd7m88wc\" (UID: \"6faf05ee-49e0-4d3e-afcd-d11d9494da44\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.817535 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8vwf\" (UniqueName: \"kubernetes.io/projected/71602a38-6096-4224-b49a-adfccfe02180-kube-api-access-s8vwf\") pod \"ovn-operator-controller-manager-bf6d4f946-8qngq\" (UID: \"71602a38-6096-4224-b49a-adfccfe02180\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-8qngq" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.817556 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm455\" (UniqueName: \"kubernetes.io/projected/b73e8f21-70b9-4f4a-b96b-f255e80db992-kube-api-access-bm455\") pod \"mariadb-operator-controller-manager-7b88bfc995-vqb8k\" (UID: \"b73e8f21-70b9-4f4a-b96b-f255e80db992\") " pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-vqb8k" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.817588 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mp4n\" (UniqueName: 
\"kubernetes.io/projected/d4245835-8bf3-4491-9e66-a456d2fea83d-kube-api-access-8mp4n\") pod \"keystone-operator-controller-manager-568985c78-7sdkl\" (UID: \"d4245835-8bf3-4491-9e66-a456d2fea83d\") " pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.817630 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mnhc\" (UniqueName: \"kubernetes.io/projected/35577830-7016-49ad-bae0-8a9962a2e82c-kube-api-access-8mnhc\") pod \"neutron-operator-controller-manager-7cd87b778f-szhmd\" (UID: \"35577830-7016-49ad-bae0-8a9962a2e82c\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-szhmd" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.817648 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr8n7\" (UniqueName: \"kubernetes.io/projected/ca259c15-4c6d-4142-b257-12e805385d3f-kube-api-access-zr8n7\") pod \"manila-operator-controller-manager-598945d5b8-bn857\" (UID: \"ca259c15-4c6d-4142-b257-12e805385d3f\") " pod="openstack-operators/manila-operator-controller-manager-598945d5b8-bn857" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.817664 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb6f8\" (UniqueName: \"kubernetes.io/projected/9d82f35e-307a-4ed2-89e0-0649e5300e41-kube-api-access-zb6f8\") pod \"nova-operator-controller-manager-5fbbf8b6cc-2hj8b\" (UID: \"9d82f35e-307a-4ed2-89e0-0649e5300e41\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.831708 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-skl95" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.850982 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mp4n\" (UniqueName: \"kubernetes.io/projected/d4245835-8bf3-4491-9e66-a456d2fea83d-kube-api-access-8mp4n\") pod \"keystone-operator-controller-manager-568985c78-7sdkl\" (UID: \"d4245835-8bf3-4491-9e66-a456d2fea83d\") " pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.853665 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-9b6f8f78c-6wdrm"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.854733 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-9b6f8f78c-6wdrm" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.855935 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm455\" (UniqueName: \"kubernetes.io/projected/b73e8f21-70b9-4f4a-b96b-f255e80db992-kube-api-access-bm455\") pod \"mariadb-operator-controller-manager-7b88bfc995-vqb8k\" (UID: \"b73e8f21-70b9-4f4a-b96b-f255e80db992\") " pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-vqb8k" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.856709 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr8n7\" (UniqueName: \"kubernetes.io/projected/ca259c15-4c6d-4142-b257-12e805385d3f-kube-api-access-zr8n7\") pod \"manila-operator-controller-manager-598945d5b8-bn857\" (UID: \"ca259c15-4c6d-4142-b257-12e805385d3f\") " pod="openstack-operators/manila-operator-controller-manager-598945d5b8-bn857" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.859898 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.862764 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-qvbkm" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.907733 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-9b6f8f78c-6wdrm"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.919549 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w59xc\" (UniqueName: \"kubernetes.io/projected/ab509865-4e08-4927-b702-f28bfb553a27-kube-api-access-w59xc\") pod \"placement-operator-controller-manager-9b6f8f78c-6wdrm\" (UID: \"ab509865-4e08-4927-b702-f28bfb553a27\") " pod="openstack-operators/placement-operator-controller-manager-9b6f8f78c-6wdrm" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.919590 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mnhc\" (UniqueName: \"kubernetes.io/projected/35577830-7016-49ad-bae0-8a9962a2e82c-kube-api-access-8mnhc\") pod \"neutron-operator-controller-manager-7cd87b778f-szhmd\" (UID: \"35577830-7016-49ad-bae0-8a9962a2e82c\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-szhmd" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.919615 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb6f8\" (UniqueName: \"kubernetes.io/projected/9d82f35e-307a-4ed2-89e0-0649e5300e41-kube-api-access-zb6f8\") pod \"nova-operator-controller-manager-5fbbf8b6cc-2hj8b\" (UID: \"9d82f35e-307a-4ed2-89e0-0649e5300e41\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.919696 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp558\" (UniqueName: \"kubernetes.io/projected/da2bbca8-6d81-4fd9-b2a6-e52f98b7fb29-kube-api-access-qp558\") pod \"octavia-operator-controller-manager-68c649d9d-8tvrz\" (UID: \"da2bbca8-6d81-4fd9-b2a6-e52f98b7fb29\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.919738 
4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert\") pod \"openstack-baremetal-operator-controller-manager-78948ddfd7m88wc\" (UID: \"6faf05ee-49e0-4d3e-afcd-d11d9494da44\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.919790 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cprtl\" (UniqueName: \"kubernetes.io/projected/6faf05ee-49e0-4d3e-afcd-d11d9494da44-kube-api-access-cprtl\") pod \"openstack-baremetal-operator-controller-manager-78948ddfd7m88wc\" (UID: \"6faf05ee-49e0-4d3e-afcd-d11d9494da44\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.919829 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8vwf\" (UniqueName: \"kubernetes.io/projected/71602a38-6096-4224-b49a-adfccfe02180-kube-api-access-s8vwf\") pod \"ovn-operator-controller-manager-bf6d4f946-8qngq\" (UID: \"71602a38-6096-4224-b49a-adfccfe02180\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-8qngq" Jan 05 14:05:16 crc kubenswrapper[4740]: E0105 14:05:16.920539 4740 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 05 14:05:16 crc kubenswrapper[4740]: E0105 14:05:16.920577 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert podName:6faf05ee-49e0-4d3e-afcd-d11d9494da44 nodeName:}" failed. No retries permitted until 2026-01-05 14:05:17.420562326 +0000 UTC m=+966.727470905 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert") pod "openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" (UID: "6faf05ee-49e0-4d3e-afcd-d11d9494da44") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.923973 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-bb586bbf4-vvlkq"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.924978 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-vvlkq" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.929762 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-kkj4x" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.943297 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8vwf\" (UniqueName: \"kubernetes.io/projected/71602a38-6096-4224-b49a-adfccfe02180-kube-api-access-s8vwf\") pod \"ovn-operator-controller-manager-bf6d4f946-8qngq\" (UID: \"71602a38-6096-4224-b49a-adfccfe02180\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-8qngq" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.946082 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mnhc\" (UniqueName: \"kubernetes.io/projected/35577830-7016-49ad-bae0-8a9962a2e82c-kube-api-access-8mnhc\") pod \"neutron-operator-controller-manager-7cd87b778f-szhmd\" (UID: \"35577830-7016-49ad-bae0-8a9962a2e82c\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-szhmd" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.948259 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp558\" (UniqueName: \"kubernetes.io/projected/da2bbca8-6d81-4fd9-b2a6-e52f98b7fb29-kube-api-access-qp558\") pod \"octavia-operator-controller-manager-68c649d9d-8tvrz\" (UID: \"da2bbca8-6d81-4fd9-b2a6-e52f98b7fb29\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz" Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.953891 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bb586bbf4-vvlkq"] Jan 05 14:05:16 crc kubenswrapper[4740]: I0105 14:05:16.969981 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb6f8\" (UniqueName: \"kubernetes.io/projected/9d82f35e-307a-4ed2-89e0-0649e5300e41-kube-api-access-zb6f8\") pod \"nova-operator-controller-manager-5fbbf8b6cc-2hj8b\" (UID: \"9d82f35e-307a-4ed2-89e0-0649e5300e41\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.011291 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cprtl\" (UniqueName: \"kubernetes.io/projected/6faf05ee-49e0-4d3e-afcd-d11d9494da44-kube-api-access-cprtl\") pod \"openstack-baremetal-operator-controller-manager-78948ddfd7m88wc\" (UID: \"6faf05ee-49e0-4d3e-afcd-d11d9494da44\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.024630 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6c7n\" (UniqueName: \"kubernetes.io/projected/d1228a5b-52ed-4d7e-940e-b4b03288fae5-kube-api-access-d6c7n\") pod \"swift-operator-controller-manager-bb586bbf4-vvlkq\" (UID: \"d1228a5b-52ed-4d7e-940e-b4b03288fae5\") " pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-vvlkq" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.024940 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w59xc\" (UniqueName: \"kubernetes.io/projected/ab509865-4e08-4927-b702-f28bfb553a27-kube-api-access-w59xc\") pod 
\"placement-operator-controller-manager-9b6f8f78c-6wdrm\" (UID: \"ab509865-4e08-4927-b702-f28bfb553a27\") " pod="openstack-operators/placement-operator-controller-manager-9b6f8f78c-6wdrm" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.038795 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.041385 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw"] Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.066319 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.068224 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w59xc\" (UniqueName: \"kubernetes.io/projected/ab509865-4e08-4927-b702-f28bfb553a27-kube-api-access-w59xc\") pod \"placement-operator-controller-manager-9b6f8f78c-6wdrm\" (UID: \"ab509865-4e08-4927-b702-f28bfb553a27\") " pod="openstack-operators/placement-operator-controller-manager-9b6f8f78c-6wdrm" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.080524 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw"] Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.080620 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.084534 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-bn857" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.086803 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-4gscm" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.094333 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-vqb8k" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.094591 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-6c866cfdcb-jbqnp"] Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.095968 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-jbqnp" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.102300 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6c866cfdcb-jbqnp"] Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.103697 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-p2qvt" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.115309 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-szhmd" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.129379 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n45h5\" (UniqueName: \"kubernetes.io/projected/c7f964a1-5e19-4cb6-8e25-26fdc09410af-kube-api-access-n45h5\") pod \"telemetry-operator-controller-manager-7dc6b6df78-rkxtw\" (UID: \"c7f964a1-5e19-4cb6-8e25-26fdc09410af\") " pod="openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.129436 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6zq7\" (UniqueName: \"kubernetes.io/projected/c5e3ed99-183e-41f6-bbee-d5c8e7f629d1-kube-api-access-k6zq7\") pod \"test-operator-controller-manager-6c866cfdcb-jbqnp\" (UID: \"c5e3ed99-183e-41f6-bbee-d5c8e7f629d1\") " pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-jbqnp" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.129500 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6c7n\" (UniqueName: \"kubernetes.io/projected/d1228a5b-52ed-4d7e-940e-b4b03288fae5-kube-api-access-d6c7n\") pod \"swift-operator-controller-manager-bb586bbf4-vvlkq\" (UID: \"d1228a5b-52ed-4d7e-940e-b4b03288fae5\") " pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-vvlkq" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.131530 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.150127 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6c7n\" (UniqueName: \"kubernetes.io/projected/d1228a5b-52ed-4d7e-940e-b4b03288fae5-kube-api-access-d6c7n\") pod \"swift-operator-controller-manager-bb586bbf4-vvlkq\" (UID: \"d1228a5b-52ed-4d7e-940e-b4b03288fae5\") " pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-vvlkq" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.168016 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-9dbdf6486-tfjlm"] Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.169473 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-9dbdf6486-tfjlm"] Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.169577 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-tfjlm" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.172329 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-m5lgv" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.187899 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.235382 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert\") pod \"infra-operator-controller-manager-6d99759cf-mglsd\" (UID: \"3f0a6bbe-32b3-4e4d-afef-32e871616c6d\") " pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.235417 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n45h5\" (UniqueName: \"kubernetes.io/projected/c7f964a1-5e19-4cb6-8e25-26fdc09410af-kube-api-access-n45h5\") pod \"telemetry-operator-controller-manager-7dc6b6df78-rkxtw\" (UID: \"c7f964a1-5e19-4cb6-8e25-26fdc09410af\") " pod="openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.235458 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6zq7\" (UniqueName: \"kubernetes.io/projected/c5e3ed99-183e-41f6-bbee-d5c8e7f629d1-kube-api-access-k6zq7\") pod \"test-operator-controller-manager-6c866cfdcb-jbqnp\" (UID: \"c5e3ed99-183e-41f6-bbee-d5c8e7f629d1\") " pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-jbqnp" Jan 05 14:05:17 crc kubenswrapper[4740]: E0105 14:05:17.235878 4740 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 05 14:05:17 crc kubenswrapper[4740]: E0105 14:05:17.235913 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert podName:3f0a6bbe-32b3-4e4d-afef-32e871616c6d nodeName:}" failed. No retries permitted until 2026-01-05 14:05:18.235901666 +0000 UTC m=+967.542810245 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert") pod "infra-operator-controller-manager-6d99759cf-mglsd" (UID: "3f0a6bbe-32b3-4e4d-afef-32e871616c6d") : secret "infra-operator-webhook-server-cert" not found Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.238752 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-8qngq" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.243710 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84"] Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.244920 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.250856 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.252300 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-8z79g" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.252446 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.259665 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n45h5\" (UniqueName: \"kubernetes.io/projected/c7f964a1-5e19-4cb6-8e25-26fdc09410af-kube-api-access-n45h5\") pod \"telemetry-operator-controller-manager-7dc6b6df78-rkxtw\" (UID: \"c7f964a1-5e19-4cb6-8e25-26fdc09410af\") " pod="openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.259900 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6zq7\" (UniqueName: \"kubernetes.io/projected/c5e3ed99-183e-41f6-bbee-d5c8e7f629d1-kube-api-access-k6zq7\") pod \"test-operator-controller-manager-6c866cfdcb-jbqnp\" (UID: \"c5e3ed99-183e-41f6-bbee-d5c8e7f629d1\") " pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-jbqnp" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.287206 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84"] Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.299165 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6ppzp"] Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.300334 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6ppzp" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.311095 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-c29gs" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.314852 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-9b6f8f78c-6wdrm" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.342924 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8j55\" (UniqueName: \"kubernetes.io/projected/49bbef73-8653-4747-93ee-35819a394b1f-kube-api-access-t8j55\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.342998 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.343072 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz5ht\" (UniqueName: \"kubernetes.io/projected/b8ae6035-6986-4e15-ac19-6e093c0a9e7a-kube-api-access-jz5ht\") pod \"watcher-operator-controller-manager-9dbdf6486-tfjlm\" (UID: \"b8ae6035-6986-4e15-ac19-6e093c0a9e7a\") " pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-tfjlm" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.343098 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.343122 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8922\" (UniqueName: \"kubernetes.io/projected/ed1081fd-322e-4e98-aa00-e5aeef21b7b3-kube-api-access-c8922\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6ppzp\" (UID: \"ed1081fd-322e-4e98-aa00-e5aeef21b7b3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6ppzp" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.353701 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6ppzp"] Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.358669 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-vvlkq" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.414462 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.434380 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-jbqnp" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.443986 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8j55\" (UniqueName: \"kubernetes.io/projected/49bbef73-8653-4747-93ee-35819a394b1f-kube-api-access-t8j55\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.444043 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.444099 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert\") pod \"openstack-baremetal-operator-controller-manager-78948ddfd7m88wc\" (UID: \"6faf05ee-49e0-4d3e-afcd-d11d9494da44\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.444124 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz5ht\" (UniqueName: \"kubernetes.io/projected/b8ae6035-6986-4e15-ac19-6e093c0a9e7a-kube-api-access-jz5ht\") pod \"watcher-operator-controller-manager-9dbdf6486-tfjlm\" (UID: \"b8ae6035-6986-4e15-ac19-6e093c0a9e7a\") " pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-tfjlm" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.444150 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.444174 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8922\" (UniqueName: \"kubernetes.io/projected/ed1081fd-322e-4e98-aa00-e5aeef21b7b3-kube-api-access-c8922\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6ppzp\" (UID: \"ed1081fd-322e-4e98-aa00-e5aeef21b7b3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6ppzp" Jan 05 14:05:17 crc kubenswrapper[4740]: E0105 14:05:17.444732 4740 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 05 14:05:17 crc kubenswrapper[4740]: E0105 14:05:17.444770 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs podName:49bbef73-8653-4747-93ee-35819a394b1f nodeName:}" failed. No retries permitted until 2026-01-05 14:05:17.944756238 +0000 UTC m=+967.251664817 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs") pod "openstack-operator-controller-manager-779f597f97-v7z84" (UID: "49bbef73-8653-4747-93ee-35819a394b1f") : secret "webhook-server-cert" not found Jan 05 14:05:17 crc kubenswrapper[4740]: E0105 14:05:17.444886 4740 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 05 14:05:17 crc kubenswrapper[4740]: E0105 14:05:17.444909 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert podName:6faf05ee-49e0-4d3e-afcd-d11d9494da44 nodeName:}" failed. No retries permitted until 2026-01-05 14:05:18.444901392 +0000 UTC m=+967.751809971 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert") pod "openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" (UID: "6faf05ee-49e0-4d3e-afcd-d11d9494da44") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 05 14:05:17 crc kubenswrapper[4740]: E0105 14:05:17.445083 4740 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 05 14:05:17 crc kubenswrapper[4740]: E0105 14:05:17.445105 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs podName:49bbef73-8653-4747-93ee-35819a394b1f nodeName:}" failed. No retries permitted until 2026-01-05 14:05:17.945098337 +0000 UTC m=+967.252006916 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs") pod "openstack-operator-controller-manager-779f597f97-v7z84" (UID: "49bbef73-8653-4747-93ee-35819a394b1f") : secret "metrics-server-cert" not found Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.477001 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8j55\" (UniqueName: \"kubernetes.io/projected/49bbef73-8653-4747-93ee-35819a394b1f-kube-api-access-t8j55\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.479876 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz5ht\" (UniqueName: \"kubernetes.io/projected/b8ae6035-6986-4e15-ac19-6e093c0a9e7a-kube-api-access-jz5ht\") pod \"watcher-operator-controller-manager-9dbdf6486-tfjlm\" (UID: \"b8ae6035-6986-4e15-ac19-6e093c0a9e7a\") " pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-tfjlm" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.484403 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8922\" (UniqueName: \"kubernetes.io/projected/ed1081fd-322e-4e98-aa00-e5aeef21b7b3-kube-api-access-c8922\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6ppzp\" (UID: \"ed1081fd-322e-4e98-aa00-e5aeef21b7b3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6ppzp" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.514873 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/cinder-operator-controller-manager-78979fc445-5dgxw"] Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.518390 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-tfjlm" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.547247 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-hfcj9"] Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.642664 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6ppzp" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.668004 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq"] Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.877810 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" event={"ID":"d8ffad98-ed22-4c4c-b0b8-234c3358089e","Type":"ContainerStarted","Data":"c030fcedb503d4821c41d999909ee862301d8d38a3a1e48482bd8bc66d6e7b01"} Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.884787 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-5dgxw" event={"ID":"a07332cc-11af-4d3a-8761-891417586bd1","Type":"ContainerStarted","Data":"2736b0b3ce6ae8f89c86fe9fc5439f01ad97d1b1e9674c7f73a80fb706ca4f7b"} Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.888763 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-hfcj9" event={"ID":"01f58b56-275e-432c-aecc-f9853194f0fd","Type":"ContainerStarted","Data":"5d097a6aa520af35155a715571f30de1ce8d1213adcf029046c0bb6b204f559f"} Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.967249 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:17 crc kubenswrapper[4740]: I0105 14:05:17.967360 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:17 crc kubenswrapper[4740]: E0105 14:05:17.967431 4740 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 05 14:05:17 crc kubenswrapper[4740]: E0105 14:05:17.967501 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs podName:49bbef73-8653-4747-93ee-35819a394b1f nodeName:}" failed. No retries permitted until 2026-01-05 14:05:18.967483772 +0000 UTC m=+968.274392351 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs") pod "openstack-operator-controller-manager-779f597f97-v7z84" (UID: "49bbef73-8653-4747-93ee-35819a394b1f") : secret "webhook-server-cert" not found Jan 05 14:05:17 crc kubenswrapper[4740]: E0105 14:05:17.967611 4740 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 05 14:05:17 crc kubenswrapper[4740]: E0105 14:05:17.967676 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs podName:49bbef73-8653-4747-93ee-35819a394b1f nodeName:}" failed. No retries permitted until 2026-01-05 14:05:18.967660266 +0000 UTC m=+968.274568845 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs") pod "openstack-operator-controller-manager-779f597f97-v7z84" (UID: "49bbef73-8653-4747-93ee-35819a394b1f") : secret "metrics-server-cert" not found Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.193340 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs"] Jan 05 14:05:18 crc kubenswrapper[4740]: W0105 14:05:18.214299 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3868391b_95fe_40be_a77d_593ea72fd786.slice/crio-37b103ef36d44d4280ef6e52811895fc53d8c0f9ef824af4a5a058a8abc52fe6 WatchSource:0}: Error finding container 37b103ef36d44d4280ef6e52811895fc53d8c0f9ef824af4a5a058a8abc52fe6: Status 404 returned error can't find the container with id 37b103ef36d44d4280ef6e52811895fc53d8c0f9ef824af4a5a058a8abc52fe6 Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.214899 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-skl95"] Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.244423 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l"] Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.272430 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert\") pod \"infra-operator-controller-manager-6d99759cf-mglsd\" (UID: \"3f0a6bbe-32b3-4e4d-afef-32e871616c6d\") " pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" Jan 05 14:05:18 crc kubenswrapper[4740]: E0105 14:05:18.272680 4740 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 05 14:05:18 crc kubenswrapper[4740]: E0105 14:05:18.272726 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert podName:3f0a6bbe-32b3-4e4d-afef-32e871616c6d nodeName:}" failed. No retries permitted until 2026-01-05 14:05:20.272711601 +0000 UTC m=+969.579620180 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert") pod "infra-operator-controller-manager-6d99759cf-mglsd" (UID: "3f0a6bbe-32b3-4e4d-afef-32e871616c6d") : secret "infra-operator-webhook-server-cert" not found Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.475858 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert\") pod \"openstack-baremetal-operator-controller-manager-78948ddfd7m88wc\" (UID: \"6faf05ee-49e0-4d3e-afcd-d11d9494da44\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 14:05:18 crc kubenswrapper[4740]: E0105 14:05:18.476172 4740 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 05 14:05:18 crc kubenswrapper[4740]: E0105 14:05:18.476228 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert podName:6faf05ee-49e0-4d3e-afcd-d11d9494da44 nodeName:}" failed. No retries permitted until 2026-01-05 14:05:20.476212251 +0000 UTC m=+969.783120840 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert") pod "openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" (UID: "6faf05ee-49e0-4d3e-afcd-d11d9494da44") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.500890 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl"] Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.514960 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8"] Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.525405 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-598945d5b8-bn857"] Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.539599 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b88bfc995-vqb8k"] Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.905963 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" event={"ID":"3868391b-95fe-40be-a77d-593ea72fd786","Type":"ContainerStarted","Data":"37b103ef36d44d4280ef6e52811895fc53d8c0f9ef824af4a5a058a8abc52fe6"} Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.916234 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-bn857" event={"ID":"ca259c15-4c6d-4142-b257-12e805385d3f","Type":"ContainerStarted","Data":"553182aab500428036eebc6b40875d4947f2922cd8763212c7e6eb1a4e49ccc1"} Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.917951 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l" event={"ID":"e2dc84c3-c204-4f17-bcf3-418ab17b873d","Type":"ContainerStarted","Data":"9f90ec48080a6ef5d0146f833184224b9ef131c35b386afb701ed3579d2c7f5f"} Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.919908 4740 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-vqb8k" event={"ID":"b73e8f21-70b9-4f4a-b96b-f255e80db992","Type":"ContainerStarted","Data":"52c2ee12e5489301431d397ca33a92f5c480494d1ce29d77964ed1fcb237d8ae"} Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.926945 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-skl95" event={"ID":"61306334-5c80-4b48-8c47-bbc9a26f5ef3","Type":"ContainerStarted","Data":"28c64bc107072012c377819a2ec015e8596efbacab605893562c47ccab1f8771"} Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.928105 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" event={"ID":"d4245835-8bf3-4491-9e66-a456d2fea83d","Type":"ContainerStarted","Data":"fa3b2aea75d30f0eb5fcd6bbc674e9ede71afa792814ce70887e111a05b27969"} Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.928882 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" event={"ID":"cb54fddc-8710-4066-908b-bb7a00a15c7e","Type":"ContainerStarted","Data":"1ae440af0983208bbe10f20b44908f7928b9cfdfda77e81026fb989a2bb6fe16"} Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.985141 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:18 crc kubenswrapper[4740]: I0105 14:05:18.985364 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:18 crc kubenswrapper[4740]: E0105 14:05:18.985621 4740 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 05 14:05:18 crc kubenswrapper[4740]: E0105 14:05:18.985720 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs podName:49bbef73-8653-4747-93ee-35819a394b1f nodeName:}" failed. No retries permitted until 2026-01-05 14:05:20.98569975 +0000 UTC m=+970.292608349 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs") pod "openstack-operator-controller-manager-779f597f97-v7z84" (UID: "49bbef73-8653-4747-93ee-35819a394b1f") : secret "webhook-server-cert" not found Jan 05 14:05:18 crc kubenswrapper[4740]: E0105 14:05:18.986299 4740 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 05 14:05:18 crc kubenswrapper[4740]: E0105 14:05:18.986334 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs podName:49bbef73-8653-4747-93ee-35819a394b1f nodeName:}" failed. 
No retries permitted until 2026-01-05 14:05:20.986323357 +0000 UTC m=+970.293231946 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs") pod "openstack-operator-controller-manager-779f597f97-v7z84" (UID: "49bbef73-8653-4747-93ee-35819a394b1f") : secret "metrics-server-cert" not found Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.093639 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bb586bbf4-vvlkq"] Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.162936 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6c866cfdcb-jbqnp"] Jan 05 14:05:19 crc kubenswrapper[4740]: W0105 14:05:19.176845 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda2bbca8_6d81_4fd9_b2a6_e52f98b7fb29.slice/crio-4de2bbdcb4f788ff2354dcfdf32338df525f93b8a6a8ba15437530dde2748642 WatchSource:0}: Error finding container 4de2bbdcb4f788ff2354dcfdf32338df525f93b8a6a8ba15437530dde2748642: Status 404 returned error can't find the container with id 4de2bbdcb4f788ff2354dcfdf32338df525f93b8a6a8ba15437530dde2748642 Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.179867 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz"] Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.194189 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-9dbdf6486-tfjlm"] Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.204978 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-szhmd"] Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.217381 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw"] Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.237130 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-8qngq"] Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.247981 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b"] Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.256493 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-9b6f8f78c-6wdrm"] Jan 05 14:05:19 crc kubenswrapper[4740]: E0105 14:05:19.280531 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zb6f8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5fbbf8b6cc-2hj8b_openstack-operators(9d82f35e-307a-4ed2-89e0-0649e5300e41): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.286723 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6ppzp"] Jan 05 14:05:19 crc kubenswrapper[4740]: E0105 14:05:19.286829 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" podUID="9d82f35e-307a-4ed2-89e0-0649e5300e41" Jan 05 14:05:19 crc kubenswrapper[4740]: W0105 14:05:19.323048 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded1081fd_322e_4e98_aa00_e5aeef21b7b3.slice/crio-26f2c6245212edc9215d4c5456314e25d33b390150116daa52839c76fec80dd4 WatchSource:0}: Error finding container 26f2c6245212edc9215d4c5456314e25d33b390150116daa52839c76fec80dd4: Status 404 returned error can't find the container with id 26f2c6245212edc9215d4c5456314e25d33b390150116daa52839c76fec80dd4 Jan 05 14:05:19 crc kubenswrapper[4740]: E0105 14:05:19.326368 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c8922,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-6ppzp_openstack-operators(ed1081fd-322e-4e98-aa00-e5aeef21b7b3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 05 14:05:19 crc kubenswrapper[4740]: E0105 14:05:19.327484 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6ppzp" podUID="ed1081fd-322e-4e98-aa00-e5aeef21b7b3" Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.941749 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-szhmd" event={"ID":"35577830-7016-49ad-bae0-8a9962a2e82c","Type":"ContainerStarted","Data":"7dd65e34188c00a37197f29b2642d583a3a141d1a5badacea6601e9da7f0f85a"} Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.942769 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-8qngq" event={"ID":"71602a38-6096-4224-b49a-adfccfe02180","Type":"ContainerStarted","Data":"d2c861478f0272dc2a5931a04b74e2d9b73a3837fa9fd16b198ec17bc3703087"} Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.943994 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-tfjlm" event={"ID":"b8ae6035-6986-4e15-ac19-6e093c0a9e7a","Type":"ContainerStarted","Data":"d3e7f32bc361845f215a7b600b3272674c361ec7093e4cfa1833e418dfda8b57"} Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.945480 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" event={"ID":"9d82f35e-307a-4ed2-89e0-0649e5300e41","Type":"ContainerStarted","Data":"95c9a94a54bc065ab455375c50fc573b335295f6311d66698a59cb85b3511059"} Jan 05 14:05:19 crc kubenswrapper[4740]: E0105 14:05:19.949598 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" podUID="9d82f35e-307a-4ed2-89e0-0649e5300e41" Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.950354 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-jbqnp" event={"ID":"c5e3ed99-183e-41f6-bbee-d5c8e7f629d1","Type":"ContainerStarted","Data":"e739fdf0a2420069df0f511ca12cadeb37d818b4dcd47fc5eff249c5945d31d6"} Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.952330 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw" event={"ID":"c7f964a1-5e19-4cb6-8e25-26fdc09410af","Type":"ContainerStarted","Data":"9c530271685c71b8c5d3ba018412fac3f587fdb2b2b8fd658add8b626a3a412b"} Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.953512 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6ppzp" event={"ID":"ed1081fd-322e-4e98-aa00-e5aeef21b7b3","Type":"ContainerStarted","Data":"26f2c6245212edc9215d4c5456314e25d33b390150116daa52839c76fec80dd4"} Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.956379 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-9b6f8f78c-6wdrm" event={"ID":"ab509865-4e08-4927-b702-f28bfb553a27","Type":"ContainerStarted","Data":"61274e930b5caedab110d9966d9bb215d5f85e1428d107a937e7bd97474e4f0f"} Jan 05 14:05:19 crc kubenswrapper[4740]: E0105 14:05:19.956462 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6ppzp" podUID="ed1081fd-322e-4e98-aa00-e5aeef21b7b3" Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.958332 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-vvlkq" event={"ID":"d1228a5b-52ed-4d7e-940e-b4b03288fae5","Type":"ContainerStarted","Data":"495aa733a64b4ab7aad12bd06bbfccc92bffa68e1fcfdad04a39ff9166046ca1"} Jan 05 14:05:19 crc kubenswrapper[4740]: I0105 14:05:19.962202 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz" event={"ID":"da2bbca8-6d81-4fd9-b2a6-e52f98b7fb29","Type":"ContainerStarted","Data":"4de2bbdcb4f788ff2354dcfdf32338df525f93b8a6a8ba15437530dde2748642"} Jan 05 14:05:20 crc kubenswrapper[4740]: I0105 14:05:20.333312 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert\") pod 
\"infra-operator-controller-manager-6d99759cf-mglsd\" (UID: \"3f0a6bbe-32b3-4e4d-afef-32e871616c6d\") " pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" Jan 05 14:05:20 crc kubenswrapper[4740]: E0105 14:05:20.333507 4740 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 05 14:05:20 crc kubenswrapper[4740]: E0105 14:05:20.333552 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert podName:3f0a6bbe-32b3-4e4d-afef-32e871616c6d nodeName:}" failed. No retries permitted until 2026-01-05 14:05:24.33353926 +0000 UTC m=+973.640447839 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert") pod "infra-operator-controller-manager-6d99759cf-mglsd" (UID: "3f0a6bbe-32b3-4e4d-afef-32e871616c6d") : secret "infra-operator-webhook-server-cert" not found Jan 05 14:05:20 crc kubenswrapper[4740]: I0105 14:05:20.540871 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert\") pod \"openstack-baremetal-operator-controller-manager-78948ddfd7m88wc\" (UID: \"6faf05ee-49e0-4d3e-afcd-d11d9494da44\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 14:05:20 crc kubenswrapper[4740]: E0105 14:05:20.541027 4740 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 05 14:05:20 crc kubenswrapper[4740]: E0105 14:05:20.541106 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert podName:6faf05ee-49e0-4d3e-afcd-d11d9494da44 nodeName:}" failed. No retries permitted until 2026-01-05 14:05:24.541086928 +0000 UTC m=+973.847995507 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert") pod "openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" (UID: "6faf05ee-49e0-4d3e-afcd-d11d9494da44") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 05 14:05:21 crc kubenswrapper[4740]: E0105 14:05:21.010418 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" podUID="9d82f35e-307a-4ed2-89e0-0649e5300e41" Jan 05 14:05:21 crc kubenswrapper[4740]: E0105 14:05:21.032396 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6ppzp" podUID="ed1081fd-322e-4e98-aa00-e5aeef21b7b3" Jan 05 14:05:21 crc kubenswrapper[4740]: I0105 14:05:21.047992 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:21 crc kubenswrapper[4740]: I0105 14:05:21.048116 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:21 crc kubenswrapper[4740]: E0105 14:05:21.048267 4740 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 05 14:05:21 crc kubenswrapper[4740]: E0105 14:05:21.048391 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs podName:49bbef73-8653-4747-93ee-35819a394b1f nodeName:}" failed. No retries permitted until 2026-01-05 14:05:25.048372218 +0000 UTC m=+974.355280797 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs") pod "openstack-operator-controller-manager-779f597f97-v7z84" (UID: "49bbef73-8653-4747-93ee-35819a394b1f") : secret "webhook-server-cert" not found Jan 05 14:05:21 crc kubenswrapper[4740]: E0105 14:05:21.048278 4740 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 05 14:05:21 crc kubenswrapper[4740]: E0105 14:05:21.048604 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs podName:49bbef73-8653-4747-93ee-35819a394b1f nodeName:}" failed. 
No retries permitted until 2026-01-05 14:05:25.048572573 +0000 UTC m=+974.355481152 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs") pod "openstack-operator-controller-manager-779f597f97-v7z84" (UID: "49bbef73-8653-4747-93ee-35819a394b1f") : secret "metrics-server-cert" not found Jan 05 14:05:24 crc kubenswrapper[4740]: I0105 14:05:24.411208 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert\") pod \"infra-operator-controller-manager-6d99759cf-mglsd\" (UID: \"3f0a6bbe-32b3-4e4d-afef-32e871616c6d\") " pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" Jan 05 14:05:24 crc kubenswrapper[4740]: E0105 14:05:24.411643 4740 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 05 14:05:24 crc kubenswrapper[4740]: E0105 14:05:24.411687 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert podName:3f0a6bbe-32b3-4e4d-afef-32e871616c6d nodeName:}" failed. No retries permitted until 2026-01-05 14:05:32.411673993 +0000 UTC m=+981.718582572 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert") pod "infra-operator-controller-manager-6d99759cf-mglsd" (UID: "3f0a6bbe-32b3-4e4d-afef-32e871616c6d") : secret "infra-operator-webhook-server-cert" not found Jan 05 14:05:24 crc kubenswrapper[4740]: I0105 14:05:24.614550 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert\") pod \"openstack-baremetal-operator-controller-manager-78948ddfd7m88wc\" (UID: \"6faf05ee-49e0-4d3e-afcd-d11d9494da44\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 14:05:24 crc kubenswrapper[4740]: E0105 14:05:24.614701 4740 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 05 14:05:24 crc kubenswrapper[4740]: E0105 14:05:24.614913 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert podName:6faf05ee-49e0-4d3e-afcd-d11d9494da44 nodeName:}" failed. No retries permitted until 2026-01-05 14:05:32.614899616 +0000 UTC m=+981.921808195 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert") pod "openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" (UID: "6faf05ee-49e0-4d3e-afcd-d11d9494da44") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 05 14:05:25 crc kubenswrapper[4740]: I0105 14:05:25.124241 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:25 crc kubenswrapper[4740]: I0105 14:05:25.124412 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:25 crc kubenswrapper[4740]: E0105 14:05:25.124552 4740 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 05 14:05:25 crc kubenswrapper[4740]: E0105 14:05:25.124657 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs podName:49bbef73-8653-4747-93ee-35819a394b1f nodeName:}" failed. No retries permitted until 2026-01-05 14:05:33.124631852 +0000 UTC m=+982.431540491 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs") pod "openstack-operator-controller-manager-779f597f97-v7z84" (UID: "49bbef73-8653-4747-93ee-35819a394b1f") : secret "webhook-server-cert" not found Jan 05 14:05:25 crc kubenswrapper[4740]: E0105 14:05:25.124678 4740 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 05 14:05:25 crc kubenswrapper[4740]: E0105 14:05:25.124752 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs podName:49bbef73-8653-4747-93ee-35819a394b1f nodeName:}" failed. No retries permitted until 2026-01-05 14:05:33.124735234 +0000 UTC m=+982.431643813 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs") pod "openstack-operator-controller-manager-779f597f97-v7z84" (UID: "49bbef73-8653-4747-93ee-35819a394b1f") : secret "metrics-server-cert" not found Jan 05 14:05:31 crc kubenswrapper[4740]: E0105 14:05:31.960119 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:202756538820b5fa874d07a71ece4f048f41ccca8228d359c8cd25a00e9c0848" Jan 05 14:05:31 crc kubenswrapper[4740]: E0105 14:05:31.961109 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:202756538820b5fa874d07a71ece4f048f41ccca8228d359c8cd25a00e9c0848,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cn8td,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-f99f54bc8-9s2d8_openstack-operators(cb54fddc-8710-4066-908b-bb7a00a15c7e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:05:31 crc kubenswrapper[4740]: E0105 14:05:31.962382 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" 
podUID="cb54fddc-8710-4066-908b-bb7a00a15c7e" Jan 05 14:05:32 crc kubenswrapper[4740]: E0105 14:05:32.120188 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:202756538820b5fa874d07a71ece4f048f41ccca8228d359c8cd25a00e9c0848\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" podUID="cb54fddc-8710-4066-908b-bb7a00a15c7e" Jan 05 14:05:32 crc kubenswrapper[4740]: I0105 14:05:32.468177 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert\") pod \"infra-operator-controller-manager-6d99759cf-mglsd\" (UID: \"3f0a6bbe-32b3-4e4d-afef-32e871616c6d\") " pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" Jan 05 14:05:32 crc kubenswrapper[4740]: E0105 14:05:32.468329 4740 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 05 14:05:32 crc kubenswrapper[4740]: E0105 14:05:32.468386 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert podName:3f0a6bbe-32b3-4e4d-afef-32e871616c6d nodeName:}" failed. No retries permitted until 2026-01-05 14:05:48.468371598 +0000 UTC m=+997.775280177 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert") pod "infra-operator-controller-manager-6d99759cf-mglsd" (UID: "3f0a6bbe-32b3-4e4d-afef-32e871616c6d") : secret "infra-operator-webhook-server-cert" not found Jan 05 14:05:32 crc kubenswrapper[4740]: I0105 14:05:32.671356 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert\") pod \"openstack-baremetal-operator-controller-manager-78948ddfd7m88wc\" (UID: \"6faf05ee-49e0-4d3e-afcd-d11d9494da44\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 14:05:32 crc kubenswrapper[4740]: E0105 14:05:32.671589 4740 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 05 14:05:32 crc kubenswrapper[4740]: E0105 14:05:32.671681 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert podName:6faf05ee-49e0-4d3e-afcd-d11d9494da44 nodeName:}" failed. No retries permitted until 2026-01-05 14:05:48.671654202 +0000 UTC m=+997.978562821 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert") pod "openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" (UID: "6faf05ee-49e0-4d3e-afcd-d11d9494da44") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 05 14:05:33 crc kubenswrapper[4740]: I0105 14:05:33.202215 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:33 crc kubenswrapper[4740]: I0105 14:05:33.202700 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:33 crc kubenswrapper[4740]: E0105 14:05:33.202355 4740 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 05 14:05:33 crc kubenswrapper[4740]: E0105 14:05:33.202924 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs podName:49bbef73-8653-4747-93ee-35819a394b1f nodeName:}" failed. No retries permitted until 2026-01-05 14:05:49.202906413 +0000 UTC m=+998.509814992 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs") pod "openstack-operator-controller-manager-779f597f97-v7z84" (UID: "49bbef73-8653-4747-93ee-35819a394b1f") : secret "metrics-server-cert" not found Jan 05 14:05:33 crc kubenswrapper[4740]: E0105 14:05:33.202864 4740 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 05 14:05:33 crc kubenswrapper[4740]: E0105 14:05:33.203328 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs podName:49bbef73-8653-4747-93ee-35819a394b1f nodeName:}" failed. No retries permitted until 2026-01-05 14:05:49.203319245 +0000 UTC m=+998.510227824 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs") pod "openstack-operator-controller-manager-779f597f97-v7z84" (UID: "49bbef73-8653-4747-93ee-35819a394b1f") : secret "webhook-server-cert" not found Jan 05 14:05:34 crc kubenswrapper[4740]: E0105 14:05:34.758188 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:573d7dba212cbc32101496a7cbe01e391af9891bed3bec717f16bed4d6c23e04" Jan 05 14:05:34 crc kubenswrapper[4740]: E0105 14:05:34.758583 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:573d7dba212cbc32101496a7cbe01e391af9891bed3bec717f16bed4d6c23e04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q4vlk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-658dd65b86-lz62l_openstack-operators(e2dc84c3-c204-4f17-bcf3-418ab17b873d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:05:34 crc kubenswrapper[4740]: E0105 14:05:34.759893 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l" 
podUID="e2dc84c3-c204-4f17-bcf3-418ab17b873d" Jan 05 14:05:35 crc kubenswrapper[4740]: E0105 14:05:35.151966 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:573d7dba212cbc32101496a7cbe01e391af9891bed3bec717f16bed4d6c23e04\\\"\"" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l" podUID="e2dc84c3-c204-4f17-bcf3-418ab17b873d" Jan 05 14:05:36 crc kubenswrapper[4740]: E0105 14:05:36.308219 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:19345236c6b6bd5ae772e336fa6065c6e94c8990d1bf05d30073ddb95ffffb4d" Jan 05 14:05:36 crc kubenswrapper[4740]: E0105 14:05:36.308987 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:19345236c6b6bd5ae772e336fa6065c6e94c8990d1bf05d30073ddb95ffffb4d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hvnnb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-7b549fc966-7jlfq_openstack-operators(d8ffad98-ed22-4c4c-b0b8-234c3358089e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:05:36 crc kubenswrapper[4740]: E0105 14:05:36.310367 4740 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" podUID="d8ffad98-ed22-4c4c-b0b8-234c3358089e" Jan 05 14:05:37 crc kubenswrapper[4740]: E0105 14:05:37.169779 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:19345236c6b6bd5ae772e336fa6065c6e94c8990d1bf05d30073ddb95ffffb4d\\\"\"" pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" podUID="d8ffad98-ed22-4c4c-b0b8-234c3358089e" Jan 05 14:05:37 crc kubenswrapper[4740]: E0105 14:05:37.876565 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:c10647131e6fa6afeb11ea28e513b60f22dbfbb4ddc3727850b1fe5799890c41" Jan 05 14:05:37 crc kubenswrapper[4740]: E0105 14:05:37.876801 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:c10647131e6fa6afeb11ea28e513b60f22dbfbb4ddc3727850b1fe5799890c41,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bm455,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-7b88bfc995-vqb8k_openstack-operators(b73e8f21-70b9-4f4a-b96b-f255e80db992): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:05:37 crc kubenswrapper[4740]: E0105 14:05:37.878017 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-vqb8k" podUID="b73e8f21-70b9-4f4a-b96b-f255e80db992" Jan 05 14:05:38 crc kubenswrapper[4740]: E0105 14:05:38.179743 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:c10647131e6fa6afeb11ea28e513b60f22dbfbb4ddc3727850b1fe5799890c41\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-vqb8k" podUID="b73e8f21-70b9-4f4a-b96b-f255e80db992" Jan 05 14:05:38 crc kubenswrapper[4740]: E0105 14:05:38.482905 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:afb66a0f8e1aa057888f7c304cc34cfea711805d9d1f05798aceb4029fef2989" Jan 05 14:05:38 crc kubenswrapper[4740]: E0105 14:05:38.483088 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:afb66a0f8e1aa057888f7c304cc34cfea711805d9d1f05798aceb4029fef2989,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gtmln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-f6f74d6db-jqccs_openstack-operators(3868391b-95fe-40be-a77d-593ea72fd786): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:05:38 crc kubenswrapper[4740]: E0105 14:05:38.484669 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" podUID="3868391b-95fe-40be-a77d-593ea72fd786" Jan 05 14:05:38 crc kubenswrapper[4740]: E0105 14:05:38.925693 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:f0ece9a81e4be3dbc1ff752a951970380546d8c0dea910953f862c219444b97a" Jan 05 14:05:38 crc kubenswrapper[4740]: E0105 14:05:38.925927 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:f0ece9a81e4be3dbc1ff752a951970380546d8c0dea910953f862c219444b97a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jz5ht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-9dbdf6486-tfjlm_openstack-operators(b8ae6035-6986-4e15-ac19-6e093c0a9e7a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:05:38 crc kubenswrapper[4740]: E0105 14:05:38.927303 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-tfjlm" podUID="b8ae6035-6986-4e15-ac19-6e093c0a9e7a" Jan 05 14:05:39 crc kubenswrapper[4740]: E0105 14:05:39.187184 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:f0ece9a81e4be3dbc1ff752a951970380546d8c0dea910953f862c219444b97a\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-tfjlm" podUID="b8ae6035-6986-4e15-ac19-6e093c0a9e7a" Jan 05 14:05:39 crc kubenswrapper[4740]: E0105 14:05:39.187299 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:afb66a0f8e1aa057888f7c304cc34cfea711805d9d1f05798aceb4029fef2989\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" podUID="3868391b-95fe-40be-a77d-593ea72fd786" Jan 05 14:05:39 crc kubenswrapper[4740]: E0105 14:05:39.420527 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Jan 05 14:05:39 crc kubenswrapper[4740]: E0105 14:05:39.420696 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qp558,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-68c649d9d-8tvrz_openstack-operators(da2bbca8-6d81-4fd9-b2a6-e52f98b7fb29): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:05:39 crc kubenswrapper[4740]: E0105 14:05:39.421905 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz" podUID="da2bbca8-6d81-4fd9-b2a6-e52f98b7fb29" Jan 05 14:05:40 crc kubenswrapper[4740]: E0105 14:05:40.198320 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz" podUID="da2bbca8-6d81-4fd9-b2a6-e52f98b7fb29" Jan 05 14:05:41 crc kubenswrapper[4740]: E0105 14:05:41.331146 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:174acf70c084144827fb8f96c5401a0a8def953bf0ff8929dccd629a550491b7" Jan 05 14:05:41 crc kubenswrapper[4740]: E0105 14:05:41.331914 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:174acf70c084144827fb8f96c5401a0a8def953bf0ff8929dccd629a550491b7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7467q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-78979fc445-5dgxw_openstack-operators(a07332cc-11af-4d3a-8761-891417586bd1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:05:41 crc kubenswrapper[4740]: E0105 14:05:41.333366 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-5dgxw" podUID="a07332cc-11af-4d3a-8761-891417586bd1" Jan 05 14:05:41 crc kubenswrapper[4740]: E0105 14:05:41.815386 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:879d3d679b58ae84419b7907ad092ad4d24bcc9222ce621ce464fd0fea347b0c" Jan 05 14:05:41 crc kubenswrapper[4740]: E0105 14:05:41.815767 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:879d3d679b58ae84419b7907ad092ad4d24bcc9222ce621ce464fd0fea347b0c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8mp4n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-568985c78-7sdkl_openstack-operators(d4245835-8bf3-4491-9e66-a456d2fea83d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:05:41 crc kubenswrapper[4740]: E0105 14:05:41.817796 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" podUID="d4245835-8bf3-4491-9e66-a456d2fea83d" Jan 05 14:05:42 crc kubenswrapper[4740]: E0105 14:05:42.232037 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:174acf70c084144827fb8f96c5401a0a8def953bf0ff8929dccd629a550491b7\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-5dgxw" podUID="a07332cc-11af-4d3a-8761-891417586bd1" Jan 05 14:05:42 crc kubenswrapper[4740]: E0105 14:05:42.232862 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:879d3d679b58ae84419b7907ad092ad4d24bcc9222ce621ce464fd0fea347b0c\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" 
podUID="d4245835-8bf3-4491-9e66-a456d2fea83d" Jan 05 14:05:45 crc kubenswrapper[4740]: E0105 14:05:45.129406 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a" Jan 05 14:05:45 crc kubenswrapper[4740]: E0105 14:05:45.130501 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fcrgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-66f8b87655-hfcj9_openstack-operators(01f58b56-275e-432c-aecc-f9853194f0fd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:05:45 crc kubenswrapper[4740]: E0105 14:05:45.131759 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-hfcj9" podUID="01f58b56-275e-432c-aecc-f9853194f0fd" Jan 05 14:05:45 crc kubenswrapper[4740]: E0105 14:05:45.281560 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a\\\"\"" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-hfcj9" podUID="01f58b56-275e-432c-aecc-f9853194f0fd" Jan 05 14:05:45 crc kubenswrapper[4740]: E0105 14:05:45.702487 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:4e3d234c1398039c2593611f7b0fd2a6b284cafb1563e6737876a265b9af42b6" Jan 05 14:05:45 crc kubenswrapper[4740]: E0105 14:05:45.702662 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:4e3d234c1398039c2593611f7b0fd2a6b284cafb1563e6737876a265b9af42b6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k6zq7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-6c866cfdcb-jbqnp_openstack-operators(c5e3ed99-183e-41f6-bbee-d5c8e7f629d1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:05:45 crc kubenswrapper[4740]: E0105 14:05:45.705195 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-jbqnp" 
podUID="c5e3ed99-183e-41f6-bbee-d5c8e7f629d1" Jan 05 14:05:45 crc kubenswrapper[4740]: E0105 14:05:45.774781 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/openstack-k8s-operators/telemetry-operator:9533fa79d915abe9beaf16e5c08baaa4a197eecd" Jan 05 14:05:45 crc kubenswrapper[4740]: E0105 14:05:45.774834 4740 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/openstack-k8s-operators/telemetry-operator:9533fa79d915abe9beaf16e5c08baaa4a197eecd" Jan 05 14:05:45 crc kubenswrapper[4740]: E0105 14:05:45.774977 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.5:5001/openstack-k8s-operators/telemetry-operator:9533fa79d915abe9beaf16e5c08baaa4a197eecd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n45h5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7dc6b6df78-rkxtw_openstack-operators(c7f964a1-5e19-4cb6-8e25-26fdc09410af): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:05:45 crc kubenswrapper[4740]: E0105 14:05:45.776268 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw" podUID="c7f964a1-5e19-4cb6-8e25-26fdc09410af" Jan 05 14:05:46 crc kubenswrapper[4740]: E0105 14:05:46.283751 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:4e3d234c1398039c2593611f7b0fd2a6b284cafb1563e6737876a265b9af42b6\\\"\"" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-jbqnp" podUID="c5e3ed99-183e-41f6-bbee-d5c8e7f629d1" Jan 05 14:05:46 crc kubenswrapper[4740]: E0105 14:05:46.283784 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/telemetry-operator:9533fa79d915abe9beaf16e5c08baaa4a197eecd\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw" podUID="c7f964a1-5e19-4cb6-8e25-26fdc09410af" Jan 05 14:05:48 crc kubenswrapper[4740]: I0105 14:05:48.534336 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert\") pod \"infra-operator-controller-manager-6d99759cf-mglsd\" (UID: \"3f0a6bbe-32b3-4e4d-afef-32e871616c6d\") " pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" Jan 05 14:05:48 crc kubenswrapper[4740]: I0105 14:05:48.557886 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3f0a6bbe-32b3-4e4d-afef-32e871616c6d-cert\") pod \"infra-operator-controller-manager-6d99759cf-mglsd\" (UID: \"3f0a6bbe-32b3-4e4d-afef-32e871616c6d\") " pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" Jan 05 14:05:48 crc kubenswrapper[4740]: I0105 14:05:48.738679 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert\") pod \"openstack-baremetal-operator-controller-manager-78948ddfd7m88wc\" (UID: \"6faf05ee-49e0-4d3e-afcd-d11d9494da44\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 14:05:48 crc kubenswrapper[4740]: I0105 14:05:48.745545 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6faf05ee-49e0-4d3e-afcd-d11d9494da44-cert\") pod \"openstack-baremetal-operator-controller-manager-78948ddfd7m88wc\" (UID: \"6faf05ee-49e0-4d3e-afcd-d11d9494da44\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 14:05:48 crc kubenswrapper[4740]: I0105 14:05:48.789725 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 14:05:48 crc kubenswrapper[4740]: I0105 14:05:48.851284 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" Jan 05 14:05:48 crc kubenswrapper[4740]: E0105 14:05:48.862957 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 05 14:05:48 crc kubenswrapper[4740]: E0105 14:05:48.863267 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c8922,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-6ppzp_openstack-operators(ed1081fd-322e-4e98-aa00-e5aeef21b7b3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:05:48 crc kubenswrapper[4740]: E0105 14:05:48.864513 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6ppzp" podUID="ed1081fd-322e-4e98-aa00-e5aeef21b7b3" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:49.248403 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:49.248787 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:49.255464 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-webhook-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:49.255636 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49bbef73-8653-4747-93ee-35819a394b1f-metrics-certs\") pod \"openstack-operator-controller-manager-779f597f97-v7z84\" (UID: \"49bbef73-8653-4747-93ee-35819a394b1f\") " pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:49.309903 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-skl95" event={"ID":"61306334-5c80-4b48-8c47-bbc9a26f5ef3","Type":"ContainerStarted","Data":"2526410e1bbf55cd276931b7b9af6e7ccf3f08e97cb6c57bc6552303129d7ab1"} Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:49.390811 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc"] Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:49.395394 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.318902 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l" event={"ID":"e2dc84c3-c204-4f17-bcf3-418ab17b873d","Type":"ContainerStarted","Data":"fd8fe36ea0f348e2977041bb58a978ef23357266a1632eee7a9af05e35272c95"} Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.319593 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.320272 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-8qngq" event={"ID":"71602a38-6096-4224-b49a-adfccfe02180","Type":"ContainerStarted","Data":"9b6dfe46ae3f21a07dd1830e6d3e8c1779168379dc0561cc2763b2c98655f3f0"} Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.320443 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-8qngq" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.321432 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-vvlkq" event={"ID":"d1228a5b-52ed-4d7e-940e-b4b03288fae5","Type":"ContainerStarted","Data":"ba7bf1e88e93ef042b6541a26102b84bd9b2132d99560fe665b7a7b8271f4bd5"} Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.321546 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-vvlkq" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.322424 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" event={"ID":"9d82f35e-307a-4ed2-89e0-0649e5300e41","Type":"ContainerStarted","Data":"5bb3638cb03ee3d657af2feb0002f1eafeaeadc78e629e2f3d4371cc9074b570"} Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.323145 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.324104 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-bn857" event={"ID":"ca259c15-4c6d-4142-b257-12e805385d3f","Type":"ContainerStarted","Data":"7506dddb231aee0e1cee39af9437c2f5d59c9ee139ae9a16abba79ee456d8b61"} Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.324462 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-bn857" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.325157 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" event={"ID":"6faf05ee-49e0-4d3e-afcd-d11d9494da44","Type":"ContainerStarted","Data":"0ffbf682a934af6d3ab7341b527854aa38796676e4becd3dffd8ad0aa843983c"} Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.326024 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-szhmd" 
event={"ID":"35577830-7016-49ad-bae0-8a9962a2e82c","Type":"ContainerStarted","Data":"53f4a1b3349175583ea79e479e918afd1051180300448e8801399631e62ac32a"} Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.326417 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-szhmd" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.327537 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-9b6f8f78c-6wdrm" event={"ID":"ab509865-4e08-4927-b702-f28bfb553a27","Type":"ContainerStarted","Data":"fa8c72cdddd7187bcdd5ba1dc0266a98381fc257c7451a5d61451ec21026cedd"} Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.327615 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-9b6f8f78c-6wdrm" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.328573 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" event={"ID":"cb54fddc-8710-4066-908b-bb7a00a15c7e","Type":"ContainerStarted","Data":"46af320ca584a7bd1ea5544687ffc6b2fa65c91fe544c8fefd1f91910415e9db"} Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.328889 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-skl95" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.328995 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.341873 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l" podStartSLOduration=3.647670286 podStartE2EDuration="34.341862063s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:18.23902275 +0000 UTC m=+967.545931329" lastFinishedPulling="2026-01-05 14:05:48.933214487 +0000 UTC m=+998.240123106" observedRunningTime="2026-01-05 14:05:50.337960239 +0000 UTC m=+999.644868808" watchObservedRunningTime="2026-01-05 14:05:50.341862063 +0000 UTC m=+999.648770642" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.365647 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" podStartSLOduration=4.77978825 podStartE2EDuration="34.365631968s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:19.280400798 +0000 UTC m=+968.587309377" lastFinishedPulling="2026-01-05 14:05:48.866244496 +0000 UTC m=+998.173153095" observedRunningTime="2026-01-05 14:05:50.358654132 +0000 UTC m=+999.665562711" watchObservedRunningTime="2026-01-05 14:05:50.365631968 +0000 UTC m=+999.672540547" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.391579 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-bn857" podStartSLOduration=6.296714021 podStartE2EDuration="34.391564991s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:18.798799494 +0000 UTC m=+968.105708073" lastFinishedPulling="2026-01-05 14:05:46.893650464 +0000 UTC m=+996.200559043" observedRunningTime="2026-01-05 14:05:50.385557381 +0000 UTC 
m=+999.692465960" watchObservedRunningTime="2026-01-05 14:05:50.391564991 +0000 UTC m=+999.698473570" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.408058 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-skl95" podStartSLOduration=5.245488371 podStartE2EDuration="34.408038412s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:18.238848996 +0000 UTC m=+967.545757575" lastFinishedPulling="2026-01-05 14:05:47.401399037 +0000 UTC m=+996.708307616" observedRunningTime="2026-01-05 14:05:50.401821256 +0000 UTC m=+999.708729835" watchObservedRunningTime="2026-01-05 14:05:50.408038412 +0000 UTC m=+999.714947001" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.421687 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-szhmd" podStartSLOduration=6.788620601 podStartE2EDuration="34.421670666s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:19.260607189 +0000 UTC m=+968.567515768" lastFinishedPulling="2026-01-05 14:05:46.893657214 +0000 UTC m=+996.200565833" observedRunningTime="2026-01-05 14:05:50.413882258 +0000 UTC m=+999.720790857" watchObservedRunningTime="2026-01-05 14:05:50.421670666 +0000 UTC m=+999.728579245" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.428447 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-9b6f8f78c-6wdrm" podStartSLOduration=6.286782575 podStartE2EDuration="34.428440297s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:19.260202057 +0000 UTC m=+968.567110626" lastFinishedPulling="2026-01-05 14:05:47.401859769 +0000 UTC m=+996.708768348" observedRunningTime="2026-01-05 14:05:50.427477752 +0000 UTC m=+999.734386331" watchObservedRunningTime="2026-01-05 14:05:50.428440297 +0000 UTC m=+999.735348876" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.448914 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" podStartSLOduration=4.109163974 podStartE2EDuration="34.448899885s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:18.508830602 +0000 UTC m=+967.815739181" lastFinishedPulling="2026-01-05 14:05:48.848566493 +0000 UTC m=+998.155475092" observedRunningTime="2026-01-05 14:05:50.447128987 +0000 UTC m=+999.754037566" watchObservedRunningTime="2026-01-05 14:05:50.448899885 +0000 UTC m=+999.755808464" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.481858 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-8qngq" podStartSLOduration=6.848941394 podStartE2EDuration="34.481844685s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:19.260892537 +0000 UTC m=+968.567801116" lastFinishedPulling="2026-01-05 14:05:46.893795808 +0000 UTC m=+996.200704407" observedRunningTime="2026-01-05 14:05:50.480830818 +0000 UTC m=+999.787739397" watchObservedRunningTime="2026-01-05 14:05:50.481844685 +0000 UTC m=+999.788753264" Jan 05 14:05:52 crc kubenswrapper[4740]: I0105 14:05:50.504576 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-vvlkq" podStartSLOduration=5.236627993 podStartE2EDuration="34.504556063s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:19.134316412 +0000 UTC m=+968.441224991" lastFinishedPulling="2026-01-05 14:05:48.402244482 +0000 UTC m=+997.709153061" observedRunningTime="2026-01-05 14:05:50.500911654 +0000 UTC m=+999.807820233" watchObservedRunningTime="2026-01-05 14:05:50.504556063 +0000 UTC m=+999.811464642" Jan 05 14:05:52 crc kubenswrapper[4740]: W0105 14:05:52.996438 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f0a6bbe_32b3_4e4d_afef_32e871616c6d.slice/crio-775b655b8dfc0b77aec8f6bc55e41f6de9aeedd5b9ba3d0ea301945ad594b5dd WatchSource:0}: Error finding container 775b655b8dfc0b77aec8f6bc55e41f6de9aeedd5b9ba3d0ea301945ad594b5dd: Status 404 returned error can't find the container with id 775b655b8dfc0b77aec8f6bc55e41f6de9aeedd5b9ba3d0ea301945ad594b5dd Jan 05 14:05:53 crc kubenswrapper[4740]: I0105 14:05:53.002213 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd"] Jan 05 14:05:53 crc kubenswrapper[4740]: I0105 14:05:53.069306 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84"] Jan 05 14:05:53 crc kubenswrapper[4740]: W0105 14:05:53.071650 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49bbef73_8653_4747_93ee_35819a394b1f.slice/crio-04c047128eb7a4246826442bdce5463c00c59381c533e3ec74ff12a62151018b WatchSource:0}: Error finding container 04c047128eb7a4246826442bdce5463c00c59381c533e3ec74ff12a62151018b: Status 404 returned error can't find the container with id 04c047128eb7a4246826442bdce5463c00c59381c533e3ec74ff12a62151018b Jan 05 14:05:53 crc kubenswrapper[4740]: I0105 14:05:53.382614 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-tfjlm" event={"ID":"b8ae6035-6986-4e15-ac19-6e093c0a9e7a","Type":"ContainerStarted","Data":"62eefe8359d180111cde779e13a30d1ed0330dea4905d74de5d1b536ab11c7e7"} Jan 05 14:05:53 crc kubenswrapper[4740]: I0105 14:05:53.382922 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-tfjlm" Jan 05 14:05:53 crc kubenswrapper[4740]: I0105 14:05:53.385823 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" event={"ID":"49bbef73-8653-4747-93ee-35819a394b1f","Type":"ContainerStarted","Data":"7cc81495e396912ef7463ba8c6c0f0fa9baec5c74d5bb772c49af7be5681465d"} Jan 05 14:05:53 crc kubenswrapper[4740]: I0105 14:05:53.385857 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" event={"ID":"49bbef73-8653-4747-93ee-35819a394b1f","Type":"ContainerStarted","Data":"04c047128eb7a4246826442bdce5463c00c59381c533e3ec74ff12a62151018b"} Jan 05 14:05:53 crc kubenswrapper[4740]: I0105 14:05:53.386003 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:53 crc kubenswrapper[4740]: I0105 14:05:53.390352 4740 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" event={"ID":"3868391b-95fe-40be-a77d-593ea72fd786","Type":"ContainerStarted","Data":"0ccbb704a16425e00e5327b4dae033058262f95c49897f3972b9223dbdf21a3c"} Jan 05 14:05:53 crc kubenswrapper[4740]: I0105 14:05:53.390560 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" Jan 05 14:05:53 crc kubenswrapper[4740]: I0105 14:05:53.394669 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" event={"ID":"d8ffad98-ed22-4c4c-b0b8-234c3358089e","Type":"ContainerStarted","Data":"1e19b89385cbd2a401bca8f036c69c9b7780bf0cfd558c55f4d6532b07556da6"} Jan 05 14:05:53 crc kubenswrapper[4740]: I0105 14:05:53.394848 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" Jan 05 14:05:53 crc kubenswrapper[4740]: I0105 14:05:53.396398 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" event={"ID":"3f0a6bbe-32b3-4e4d-afef-32e871616c6d","Type":"ContainerStarted","Data":"775b655b8dfc0b77aec8f6bc55e41f6de9aeedd5b9ba3d0ea301945ad594b5dd"} Jan 05 14:05:53 crc kubenswrapper[4740]: I0105 14:05:53.401455 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-tfjlm" podStartSLOduration=3.827091292 podStartE2EDuration="37.401435833s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:19.258117002 +0000 UTC m=+968.565025581" lastFinishedPulling="2026-01-05 14:05:52.832461533 +0000 UTC m=+1002.139370122" observedRunningTime="2026-01-05 14:05:53.39868844 +0000 UTC m=+1002.705597039" watchObservedRunningTime="2026-01-05 14:05:53.401435833 +0000 UTC m=+1002.708344412" Jan 05 14:05:53 crc kubenswrapper[4740]: I0105 14:05:53.438757 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" podStartSLOduration=37.43873389 podStartE2EDuration="37.43873389s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:05:53.422699811 +0000 UTC m=+1002.729608400" watchObservedRunningTime="2026-01-05 14:05:53.43873389 +0000 UTC m=+1002.745642469" Jan 05 14:05:53 crc kubenswrapper[4740]: I0105 14:05:53.465366 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" podStartSLOduration=2.436409657 podStartE2EDuration="37.465338881s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:17.801578017 +0000 UTC m=+967.108486596" lastFinishedPulling="2026-01-05 14:05:52.830507231 +0000 UTC m=+1002.137415820" observedRunningTime="2026-01-05 14:05:53.436993873 +0000 UTC m=+1002.743902452" watchObservedRunningTime="2026-01-05 14:05:53.465338881 +0000 UTC m=+1002.772247460" Jan 05 14:05:53 crc kubenswrapper[4740]: I0105 14:05:53.469783 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" podStartSLOduration=2.856120718 
podStartE2EDuration="37.46977045s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:18.21843292 +0000 UTC m=+967.525341499" lastFinishedPulling="2026-01-05 14:05:52.832082652 +0000 UTC m=+1002.138991231" observedRunningTime="2026-01-05 14:05:53.457368729 +0000 UTC m=+1002.764277328" watchObservedRunningTime="2026-01-05 14:05:53.46977045 +0000 UTC m=+1002.776679029" Jan 05 14:05:56 crc kubenswrapper[4740]: I0105 14:05:56.784215 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l" Jan 05 14:05:56 crc kubenswrapper[4740]: I0105 14:05:56.838218 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-skl95" Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.041474 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.092871 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-bn857" Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.119177 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-szhmd" Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.141259 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.246161 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-8qngq" Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.325614 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-9b6f8f78c-6wdrm" Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.361430 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-vvlkq" Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.431272 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" event={"ID":"3f0a6bbe-32b3-4e4d-afef-32e871616c6d","Type":"ContainerStarted","Data":"9fd7f35b8fadff57344bdb0d75f1702759b27061ab94753b3a893a234b4f16ff"} Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.431518 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.432652 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-vqb8k" event={"ID":"b73e8f21-70b9-4f4a-b96b-f255e80db992","Type":"ContainerStarted","Data":"9727417cd82607621499fe7638f782610c32d43670be67889d200739964ed9d0"} Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.432805 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-vqb8k" Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.434198 4740 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-5dgxw" event={"ID":"a07332cc-11af-4d3a-8761-891417586bd1","Type":"ContainerStarted","Data":"9363f7a2b64c905e88e7bf96d709fa0de38da6a37d799b5711d558e4c7d1af55"} Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.434397 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-5dgxw" Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.435454 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz" event={"ID":"da2bbca8-6d81-4fd9-b2a6-e52f98b7fb29","Type":"ContainerStarted","Data":"1c13951c17d1d65a589c9ff7c1552adebcee1007dc5e58093a5e206ed1e31cf9"} Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.435616 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz" Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.440444 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" event={"ID":"6faf05ee-49e0-4d3e-afcd-d11d9494da44","Type":"ContainerStarted","Data":"aa80dbbbd07fd951e055dc8e080fa1a662f8321c3e1667415eea75d0594a5bff"} Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.440661 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.468974 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" podStartSLOduration=37.904265746 podStartE2EDuration="41.468957378s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:52.998200724 +0000 UTC m=+1002.305109313" lastFinishedPulling="2026-01-05 14:05:56.562892356 +0000 UTC m=+1005.869800945" observedRunningTime="2026-01-05 14:05:57.460390199 +0000 UTC m=+1006.767298848" watchObservedRunningTime="2026-01-05 14:05:57.468957378 +0000 UTC m=+1006.775865957" Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.489164 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-5dgxw" podStartSLOduration=2.528537531 podStartE2EDuration="41.489129717s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:17.58022304 +0000 UTC m=+966.887131619" lastFinishedPulling="2026-01-05 14:05:56.540815186 +0000 UTC m=+1005.847723805" observedRunningTime="2026-01-05 14:05:57.486933838 +0000 UTC m=+1006.793842417" watchObservedRunningTime="2026-01-05 14:05:57.489129717 +0000 UTC m=+1006.796038296" Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.522107 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-vqb8k" podStartSLOduration=3.848326662 podStartE2EDuration="41.522089229s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:18.801359993 +0000 UTC m=+968.108268572" lastFinishedPulling="2026-01-05 14:05:56.47512252 +0000 UTC m=+1005.782031139" observedRunningTime="2026-01-05 14:05:57.514820484 +0000 UTC m=+1006.821729063" watchObservedRunningTime="2026-01-05 14:05:57.522089229 +0000 UTC 
m=+1006.828997808" Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.556915 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz" podStartSLOduration=4.227400175 podStartE2EDuration="41.556897399s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:19.208343972 +0000 UTC m=+968.515252551" lastFinishedPulling="2026-01-05 14:05:56.537841156 +0000 UTC m=+1005.844749775" observedRunningTime="2026-01-05 14:05:57.551904555 +0000 UTC m=+1006.858813134" watchObservedRunningTime="2026-01-05 14:05:57.556897399 +0000 UTC m=+1006.863805978" Jan 05 14:05:57 crc kubenswrapper[4740]: I0105 14:05:57.590009 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" podStartSLOduration=34.468639752 podStartE2EDuration="41.589986853s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:49.415419227 +0000 UTC m=+998.722327806" lastFinishedPulling="2026-01-05 14:05:56.536766288 +0000 UTC m=+1005.843674907" observedRunningTime="2026-01-05 14:05:57.583731246 +0000 UTC m=+1006.890639825" watchObservedRunningTime="2026-01-05 14:05:57.589986853 +0000 UTC m=+1006.896895432" Jan 05 14:05:59 crc kubenswrapper[4740]: I0105 14:05:59.401385 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" Jan 05 14:05:59 crc kubenswrapper[4740]: I0105 14:05:59.481217 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" event={"ID":"d4245835-8bf3-4491-9e66-a456d2fea83d","Type":"ContainerStarted","Data":"b6f92ed4f76a726fbcf4f412fb71bbbfdf77ea63ae92a6a576c7862732534894"} Jan 05 14:05:59 crc kubenswrapper[4740]: I0105 14:05:59.481550 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" Jan 05 14:05:59 crc kubenswrapper[4740]: I0105 14:05:59.504380 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" podStartSLOduration=3.393111622 podStartE2EDuration="43.504360519s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:18.501684512 +0000 UTC m=+967.808593091" lastFinishedPulling="2026-01-05 14:05:58.612933399 +0000 UTC m=+1007.919841988" observedRunningTime="2026-01-05 14:05:59.503165767 +0000 UTC m=+1008.810074346" watchObservedRunningTime="2026-01-05 14:05:59.504360519 +0000 UTC m=+1008.811269088" Jan 05 14:06:00 crc kubenswrapper[4740]: I0105 14:06:00.491513 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw" event={"ID":"c7f964a1-5e19-4cb6-8e25-26fdc09410af","Type":"ContainerStarted","Data":"e70f19c72449bc0a5577059b4614f8d11fca25e942bf164febfcae53d337ce0e"} Jan 05 14:06:00 crc kubenswrapper[4740]: I0105 14:06:00.492013 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw" Jan 05 14:06:00 crc kubenswrapper[4740]: I0105 14:06:00.493983 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-hfcj9" 
event={"ID":"01f58b56-275e-432c-aecc-f9853194f0fd","Type":"ContainerStarted","Data":"0e95db5924f0c1ad7ca622bd4ccccb299f1d43e85508c682b7caac2b4fba77d9"} Jan 05 14:06:00 crc kubenswrapper[4740]: I0105 14:06:00.494447 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-hfcj9" Jan 05 14:06:00 crc kubenswrapper[4740]: I0105 14:06:00.511863 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw" podStartSLOduration=3.731774644 podStartE2EDuration="44.511829031s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:19.258433851 +0000 UTC m=+968.565342430" lastFinishedPulling="2026-01-05 14:06:00.038488238 +0000 UTC m=+1009.345396817" observedRunningTime="2026-01-05 14:06:00.50429364 +0000 UTC m=+1009.811202219" watchObservedRunningTime="2026-01-05 14:06:00.511829031 +0000 UTC m=+1009.818737601" Jan 05 14:06:00 crc kubenswrapper[4740]: I0105 14:06:00.524674 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-hfcj9" podStartSLOduration=2.312855915 podStartE2EDuration="44.524652614s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:17.633673299 +0000 UTC m=+966.940581878" lastFinishedPulling="2026-01-05 14:05:59.845469958 +0000 UTC m=+1009.152378577" observedRunningTime="2026-01-05 14:06:00.519197908 +0000 UTC m=+1009.826106497" watchObservedRunningTime="2026-01-05 14:06:00.524652614 +0000 UTC m=+1009.831561213" Jan 05 14:06:00 crc kubenswrapper[4740]: E0105 14:06:00.981043 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6ppzp" podUID="ed1081fd-322e-4e98-aa00-e5aeef21b7b3" Jan 05 14:06:01 crc kubenswrapper[4740]: I0105 14:06:01.514348 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-jbqnp" event={"ID":"c5e3ed99-183e-41f6-bbee-d5c8e7f629d1","Type":"ContainerStarted","Data":"8f082623af10e5c4ba2246a401b9341baa5b32e6e7319dabd90c7c390a04da21"} Jan 05 14:06:01 crc kubenswrapper[4740]: I0105 14:06:01.515199 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-jbqnp" Jan 05 14:06:01 crc kubenswrapper[4740]: I0105 14:06:01.566692 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-jbqnp" podStartSLOduration=4.222857821 podStartE2EDuration="45.56666999s" podCreationTimestamp="2026-01-05 14:05:16 +0000 UTC" firstStartedPulling="2026-01-05 14:05:19.142242294 +0000 UTC m=+968.449150873" lastFinishedPulling="2026-01-05 14:06:00.486054463 +0000 UTC m=+1009.792963042" observedRunningTime="2026-01-05 14:06:01.542760191 +0000 UTC m=+1010.849668770" watchObservedRunningTime="2026-01-05 14:06:01.56666999 +0000 UTC m=+1010.873578579" Jan 05 14:06:01 crc kubenswrapper[4740]: I0105 14:06:01.915728 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:06:01 crc kubenswrapper[4740]: I0105 14:06:01.915813 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:06:06 crc kubenswrapper[4740]: I0105 14:06:06.575273 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-5dgxw" Jan 05 14:06:06 crc kubenswrapper[4740]: I0105 14:06:06.609000 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-hfcj9" Jan 05 14:06:06 crc kubenswrapper[4740]: I0105 14:06:06.682285 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" Jan 05 14:06:06 crc kubenswrapper[4740]: I0105 14:06:06.865918 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" Jan 05 14:06:07 crc kubenswrapper[4740]: I0105 14:06:07.070947 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" Jan 05 14:06:07 crc kubenswrapper[4740]: I0105 14:06:07.100986 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-vqb8k" Jan 05 14:06:07 crc kubenswrapper[4740]: I0105 14:06:07.192872 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz" Jan 05 14:06:07 crc kubenswrapper[4740]: I0105 14:06:07.418422 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw" Jan 05 14:06:07 crc kubenswrapper[4740]: I0105 14:06:07.439556 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-jbqnp" Jan 05 14:06:07 crc kubenswrapper[4740]: I0105 14:06:07.521725 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-tfjlm" Jan 05 14:06:08 crc kubenswrapper[4740]: I0105 14:06:08.799221 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 14:06:08 crc kubenswrapper[4740]: I0105 14:06:08.870544 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" Jan 05 14:06:14 crc kubenswrapper[4740]: I0105 14:06:14.660485 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6ppzp" event={"ID":"ed1081fd-322e-4e98-aa00-e5aeef21b7b3","Type":"ContainerStarted","Data":"8c5317173250602caa068288b7db21d8826100e98ad64f51a1dae10eefc392aa"} Jan 05 14:06:14 crc 
kubenswrapper[4740]: I0105 14:06:14.696227 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6ppzp" podStartSLOduration=3.139621706 podStartE2EDuration="57.696201475s" podCreationTimestamp="2026-01-05 14:05:17 +0000 UTC" firstStartedPulling="2026-01-05 14:05:19.326254123 +0000 UTC m=+968.633162702" lastFinishedPulling="2026-01-05 14:06:13.882833892 +0000 UTC m=+1023.189742471" observedRunningTime="2026-01-05 14:06:14.67542283 +0000 UTC m=+1023.982331419" watchObservedRunningTime="2026-01-05 14:06:14.696201475 +0000 UTC m=+1024.003110094" Jan 05 14:06:31 crc kubenswrapper[4740]: I0105 14:06:31.916581 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:06:31 crc kubenswrapper[4740]: I0105 14:06:31.917596 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:06:31 crc kubenswrapper[4740]: I0105 14:06:31.976946 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rf57k"] Jan 05 14:06:31 crc kubenswrapper[4740]: I0105 14:06:31.978587 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rf57k" Jan 05 14:06:31 crc kubenswrapper[4740]: I0105 14:06:31.987559 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 05 14:06:31 crc kubenswrapper[4740]: I0105 14:06:31.987658 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 05 14:06:31 crc kubenswrapper[4740]: I0105 14:06:31.987968 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-2hbvx" Jan 05 14:06:31 crc kubenswrapper[4740]: I0105 14:06:31.988738 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.001188 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rf57k"] Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.043974 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wvwwn"] Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.046323 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wvwwn" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.051302 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.069416 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wvwwn"] Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.115342 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b7856c9-a723-48a1-bf77-d39e78f3d62c-config\") pod \"dnsmasq-dns-675f4bcbfc-rf57k\" (UID: \"4b7856c9-a723-48a1-bf77-d39e78f3d62c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rf57k" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.115435 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7nrd\" (UniqueName: \"kubernetes.io/projected/4b7856c9-a723-48a1-bf77-d39e78f3d62c-kube-api-access-z7nrd\") pod \"dnsmasq-dns-675f4bcbfc-rf57k\" (UID: \"4b7856c9-a723-48a1-bf77-d39e78f3d62c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rf57k" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.218227 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ff1f33b-0049-4089-9745-38a3da10de80-config\") pod \"dnsmasq-dns-78dd6ddcc-wvwwn\" (UID: \"1ff1f33b-0049-4089-9745-38a3da10de80\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wvwwn" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.218406 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b7856c9-a723-48a1-bf77-d39e78f3d62c-config\") pod \"dnsmasq-dns-675f4bcbfc-rf57k\" (UID: \"4b7856c9-a723-48a1-bf77-d39e78f3d62c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rf57k" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.218448 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ff1f33b-0049-4089-9745-38a3da10de80-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wvwwn\" (UID: \"1ff1f33b-0049-4089-9745-38a3da10de80\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wvwwn" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.218513 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7nrd\" (UniqueName: \"kubernetes.io/projected/4b7856c9-a723-48a1-bf77-d39e78f3d62c-kube-api-access-z7nrd\") pod \"dnsmasq-dns-675f4bcbfc-rf57k\" (UID: \"4b7856c9-a723-48a1-bf77-d39e78f3d62c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rf57k" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.219086 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfbf5\" (UniqueName: \"kubernetes.io/projected/1ff1f33b-0049-4089-9745-38a3da10de80-kube-api-access-hfbf5\") pod \"dnsmasq-dns-78dd6ddcc-wvwwn\" (UID: \"1ff1f33b-0049-4089-9745-38a3da10de80\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wvwwn" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.219723 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b7856c9-a723-48a1-bf77-d39e78f3d62c-config\") pod \"dnsmasq-dns-675f4bcbfc-rf57k\" (UID: \"4b7856c9-a723-48a1-bf77-d39e78f3d62c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rf57k" Jan 05 
14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.243891 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7nrd\" (UniqueName: \"kubernetes.io/projected/4b7856c9-a723-48a1-bf77-d39e78f3d62c-kube-api-access-z7nrd\") pod \"dnsmasq-dns-675f4bcbfc-rf57k\" (UID: \"4b7856c9-a723-48a1-bf77-d39e78f3d62c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rf57k" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.309838 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rf57k" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.321093 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ff1f33b-0049-4089-9745-38a3da10de80-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wvwwn\" (UID: \"1ff1f33b-0049-4089-9745-38a3da10de80\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wvwwn" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.321230 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfbf5\" (UniqueName: \"kubernetes.io/projected/1ff1f33b-0049-4089-9745-38a3da10de80-kube-api-access-hfbf5\") pod \"dnsmasq-dns-78dd6ddcc-wvwwn\" (UID: \"1ff1f33b-0049-4089-9745-38a3da10de80\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wvwwn" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.321289 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ff1f33b-0049-4089-9745-38a3da10de80-config\") pod \"dnsmasq-dns-78dd6ddcc-wvwwn\" (UID: \"1ff1f33b-0049-4089-9745-38a3da10de80\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wvwwn" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.322583 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ff1f33b-0049-4089-9745-38a3da10de80-config\") pod \"dnsmasq-dns-78dd6ddcc-wvwwn\" (UID: \"1ff1f33b-0049-4089-9745-38a3da10de80\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wvwwn" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.323048 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ff1f33b-0049-4089-9745-38a3da10de80-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-wvwwn\" (UID: \"1ff1f33b-0049-4089-9745-38a3da10de80\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wvwwn" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.346308 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfbf5\" (UniqueName: \"kubernetes.io/projected/1ff1f33b-0049-4089-9745-38a3da10de80-kube-api-access-hfbf5\") pod \"dnsmasq-dns-78dd6ddcc-wvwwn\" (UID: \"1ff1f33b-0049-4089-9745-38a3da10de80\") " pod="openstack/dnsmasq-dns-78dd6ddcc-wvwwn" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.373876 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wvwwn" Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.758207 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rf57k"] Jan 05 14:06:32 crc kubenswrapper[4740]: W0105 14:06:32.761760 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b7856c9_a723_48a1_bf77_d39e78f3d62c.slice/crio-51810b311715d16418bcd0937fcf07dfab90b6280c486505f4e56ad27758d4f6 WatchSource:0}: Error finding container 51810b311715d16418bcd0937fcf07dfab90b6280c486505f4e56ad27758d4f6: Status 404 returned error can't find the container with id 51810b311715d16418bcd0937fcf07dfab90b6280c486505f4e56ad27758d4f6 Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.866108 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-rf57k" event={"ID":"4b7856c9-a723-48a1-bf77-d39e78f3d62c","Type":"ContainerStarted","Data":"51810b311715d16418bcd0937fcf07dfab90b6280c486505f4e56ad27758d4f6"} Jan 05 14:06:32 crc kubenswrapper[4740]: I0105 14:06:32.877758 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wvwwn"] Jan 05 14:06:32 crc kubenswrapper[4740]: W0105 14:06:32.886135 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ff1f33b_0049_4089_9745_38a3da10de80.slice/crio-a315cc2d7a96339923e84d2486836784591f978a9bdb1e6a8942eb7767f5b91b WatchSource:0}: Error finding container a315cc2d7a96339923e84d2486836784591f978a9bdb1e6a8942eb7767f5b91b: Status 404 returned error can't find the container with id a315cc2d7a96339923e84d2486836784591f978a9bdb1e6a8942eb7767f5b91b Jan 05 14:06:33 crc kubenswrapper[4740]: I0105 14:06:33.877596 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-wvwwn" event={"ID":"1ff1f33b-0049-4089-9745-38a3da10de80","Type":"ContainerStarted","Data":"a315cc2d7a96339923e84d2486836784591f978a9bdb1e6a8942eb7767f5b91b"} Jan 05 14:06:34 crc kubenswrapper[4740]: I0105 14:06:34.817765 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rf57k"] Jan 05 14:06:34 crc kubenswrapper[4740]: I0105 14:06:34.846354 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bs4mf"] Jan 05 14:06:34 crc kubenswrapper[4740]: I0105 14:06:34.848378 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" Jan 05 14:06:34 crc kubenswrapper[4740]: I0105 14:06:34.859821 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bs4mf"] Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.011463 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-config\") pod \"dnsmasq-dns-666b6646f7-bs4mf\" (UID: \"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e\") " pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.011781 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-bs4mf\" (UID: \"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e\") " pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.011803 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkdwx\" (UniqueName: \"kubernetes.io/projected/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-kube-api-access-mkdwx\") pod \"dnsmasq-dns-666b6646f7-bs4mf\" (UID: \"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e\") " pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.113749 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-config\") pod \"dnsmasq-dns-666b6646f7-bs4mf\" (UID: \"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e\") " pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.113850 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-bs4mf\" (UID: \"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e\") " pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.113877 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkdwx\" (UniqueName: \"kubernetes.io/projected/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-kube-api-access-mkdwx\") pod \"dnsmasq-dns-666b6646f7-bs4mf\" (UID: \"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e\") " pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.114798 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-config\") pod \"dnsmasq-dns-666b6646f7-bs4mf\" (UID: \"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e\") " pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.114847 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-bs4mf\" (UID: \"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e\") " pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.152093 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkdwx\" (UniqueName: 
\"kubernetes.io/projected/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-kube-api-access-mkdwx\") pod \"dnsmasq-dns-666b6646f7-bs4mf\" (UID: \"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e\") " pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.153653 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wvwwn"] Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.180618 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7fx9n"] Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.183589 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.184245 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7fx9n" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.189205 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7fx9n"] Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.317101 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-7fx9n\" (UID: \"4b270e1a-8c5f-4c3f-b070-afa15ee2cda9\") " pod="openstack/dnsmasq-dns-57d769cc4f-7fx9n" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.317395 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5gf2\" (UniqueName: \"kubernetes.io/projected/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-kube-api-access-t5gf2\") pod \"dnsmasq-dns-57d769cc4f-7fx9n\" (UID: \"4b270e1a-8c5f-4c3f-b070-afa15ee2cda9\") " pod="openstack/dnsmasq-dns-57d769cc4f-7fx9n" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.317439 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-config\") pod \"dnsmasq-dns-57d769cc4f-7fx9n\" (UID: \"4b270e1a-8c5f-4c3f-b070-afa15ee2cda9\") " pod="openstack/dnsmasq-dns-57d769cc4f-7fx9n" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.419719 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-7fx9n\" (UID: \"4b270e1a-8c5f-4c3f-b070-afa15ee2cda9\") " pod="openstack/dnsmasq-dns-57d769cc4f-7fx9n" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.419802 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5gf2\" (UniqueName: \"kubernetes.io/projected/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-kube-api-access-t5gf2\") pod \"dnsmasq-dns-57d769cc4f-7fx9n\" (UID: \"4b270e1a-8c5f-4c3f-b070-afa15ee2cda9\") " pod="openstack/dnsmasq-dns-57d769cc4f-7fx9n" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.419866 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-config\") pod \"dnsmasq-dns-57d769cc4f-7fx9n\" (UID: \"4b270e1a-8c5f-4c3f-b070-afa15ee2cda9\") " pod="openstack/dnsmasq-dns-57d769cc4f-7fx9n" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.421813 4740 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-config\") pod \"dnsmasq-dns-57d769cc4f-7fx9n\" (UID: \"4b270e1a-8c5f-4c3f-b070-afa15ee2cda9\") " pod="openstack/dnsmasq-dns-57d769cc4f-7fx9n" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.422805 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-7fx9n\" (UID: \"4b270e1a-8c5f-4c3f-b070-afa15ee2cda9\") " pod="openstack/dnsmasq-dns-57d769cc4f-7fx9n" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.470144 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5gf2\" (UniqueName: \"kubernetes.io/projected/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-kube-api-access-t5gf2\") pod \"dnsmasq-dns-57d769cc4f-7fx9n\" (UID: \"4b270e1a-8c5f-4c3f-b070-afa15ee2cda9\") " pod="openstack/dnsmasq-dns-57d769cc4f-7fx9n" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.527718 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7fx9n" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.843491 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bs4mf"] Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.903686 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" event={"ID":"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e","Type":"ContainerStarted","Data":"cb35f0518089df11d7e4b07a9c654c785ddf77a73241db4b87e7507e8c22a37e"} Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.986592 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.989770 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.992925 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.993213 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.993410 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.993841 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.993875 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.994032 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7jslb" Jan 05 14:06:35 crc kubenswrapper[4740]: I0105 14:06:35.994161 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.040844 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.042490 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.051923 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.054331 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.067647 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.078381 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.092031 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.140041 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-config-data\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.140108 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.140327 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.140875 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxvnp\" (UniqueName: \"kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-kube-api-access-jxvnp\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.140907 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.141091 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.141156 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " 
pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.141334 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.141378 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.141413 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.141482 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-config-data\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.141504 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.141554 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.141578 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.141635 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qchns\" (UniqueName: \"kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-kube-api-access-qchns\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.141686 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 
crc kubenswrapper[4740]: I0105 14:06:36.141719 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.141782 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-pod-info\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.143127 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.143298 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.143405 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-server-conf\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.143452 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.245435 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.245513 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.245538 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.245586 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-config-data\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.245611 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.245662 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.245683 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.245708 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qchns\" (UniqueName: \"kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-kube-api-access-qchns\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.245796 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.245852 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.245877 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-pod-info\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.245918 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.245938 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.245966 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-server-conf\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246005 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-config-data\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246022 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246048 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246125 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k26n\" (UniqueName: \"kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-kube-api-access-2k26n\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246168 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-config-data\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246187 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246217 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246257 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxvnp\" (UniqueName: \"kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-kube-api-access-jxvnp\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc 
kubenswrapper[4740]: I0105 14:06:36.246273 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246287 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246333 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246352 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246369 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246420 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246440 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4396968c-d77b-434d-888f-3ab578514bbe-pod-info\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246456 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-server-conf\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246495 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246513 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.246533 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4396968c-d77b-434d-888f-3ab578514bbe-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.247309 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.247601 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.248255 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.248755 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.251496 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-config-data\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.251682 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.253000 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.253266 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 
14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.253341 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.254014 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-config-data\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.254659 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-server-conf\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.255636 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.257625 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.257793 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.257818 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f2f7b9b5abaa8990d589f1063e4eaf9d9b88b37cbfdc58c57397865c99b07d3b/globalmount\"" pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.257891 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.257914 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.257919 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2969f94af2ae2bb1af3fa7e4547ad87c9fa0b3a503c46d52774766932196392c/globalmount\"" pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.260409 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.272397 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-pod-info\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.275506 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.280045 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qchns\" (UniqueName: \"kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-kube-api-access-qchns\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.280759 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxvnp\" (UniqueName: \"kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-kube-api-access-jxvnp\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.285119 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.293538 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.294497 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.311369 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.315714 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.315733 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.315949 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.316288 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.317092 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.321078 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-2n6bj" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.338184 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.343048 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\") pod \"rabbitmq-server-0\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.345873 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7fx9n"] Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.348150 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.348369 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.348456 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.348520 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4396968c-d77b-434d-888f-3ab578514bbe-pod-info\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.348585 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-server-conf\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.348651 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.348779 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4396968c-d77b-434d-888f-3ab578514bbe-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.349092 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.349296 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-config-data\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.349371 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.349442 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k26n\" (UniqueName: \"kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-kube-api-access-2k26n\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.349009 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.354609 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-plugins-conf\") pod \"rabbitmq-server-1\" (UID: 
\"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.355156 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-config-data\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.355239 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4396968c-d77b-434d-888f-3ab578514bbe-pod-info\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.355504 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-server-conf\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.357188 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.359998 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4396968c-d77b-434d-888f-3ab578514bbe-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.358877 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\") pod \"rabbitmq-server-2\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.367735 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.375971 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.377245 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.377294 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/db0cb9ade84c891cd0bb9eb2fcd30e636936804d545309ca85d72d0417d04c94/globalmount\"" pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.377638 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k26n\" (UniqueName: \"kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-kube-api-access-2k26n\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: W0105 14:06:36.378244 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b270e1a_8c5f_4c3f_b070_afa15ee2cda9.slice/crio-ca78bb24854395cf67f9a0557895ccf05c178357d90ee47338738e88d611a477 WatchSource:0}: Error finding container ca78bb24854395cf67f9a0557895ccf05c178357d90ee47338738e88d611a477: Status 404 returned error can't find the container with id ca78bb24854395cf67f9a0557895ccf05c178357d90ee47338738e88d611a477 Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.448387 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\") pod \"rabbitmq-server-1\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.455569 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/54c80dba-b90f-4288-a366-4ff77f76db22-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.455629 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.455679 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.455718 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.455745 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m9b7\" (UniqueName: \"kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-kube-api-access-2m9b7\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.455770 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.455832 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.455855 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.455933 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0142779e-9b14-4987-927f-6abae3becbdb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0142779e-9b14-4987-927f-6abae3becbdb\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.455969 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/54c80dba-b90f-4288-a366-4ff77f76db22-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.455991 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.557948 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0142779e-9b14-4987-927f-6abae3becbdb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0142779e-9b14-4987-927f-6abae3becbdb\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.557998 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/54c80dba-b90f-4288-a366-4ff77f76db22-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.558020 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.558052 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/54c80dba-b90f-4288-a366-4ff77f76db22-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.558103 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.558138 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.558164 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.558185 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m9b7\" (UniqueName: \"kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-kube-api-access-2m9b7\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.558202 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.558247 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.558262 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.559366 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.559573 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.559889 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.562885 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.562956 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.562977 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0142779e-9b14-4987-927f-6abae3becbdb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0142779e-9b14-4987-927f-6abae3becbdb\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/db121f6067a3dd90232a4eb13ff6330ceaa43b4082caf1466f839106ba50b025/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.563592 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.571832 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/54c80dba-b90f-4288-a366-4ff77f76db22-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.572567 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.573681 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.573705 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/54c80dba-b90f-4288-a366-4ff77f76db22-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.582867 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m9b7\" (UniqueName: \"kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-kube-api-access-2m9b7\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.598689 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0142779e-9b14-4987-927f-6abae3becbdb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0142779e-9b14-4987-927f-6abae3becbdb\") pod \"rabbitmq-cell1-server-0\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.647441 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.676678 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.687406 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.765872 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:06:36 crc kubenswrapper[4740]: I0105 14:06:36.920036 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7fx9n" event={"ID":"4b270e1a-8c5f-4c3f-b070-afa15ee2cda9","Type":"ContainerStarted","Data":"ca78bb24854395cf67f9a0557895ccf05c178357d90ee47338738e88d611a477"} Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.305550 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.309765 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 05 14:06:37 crc kubenswrapper[4740]: W0105 14:06:37.321316 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bc8ef80_7c3c_4e61_8eaf_294ab4a75299.slice/crio-253c93cf4975e48e292f3708d36f0d8c41fe5dfc3d05db513a6a0174493b358e WatchSource:0}: Error finding container 253c93cf4975e48e292f3708d36f0d8c41fe5dfc3d05db513a6a0174493b358e: Status 404 returned error can't find the container with id 253c93cf4975e48e292f3708d36f0d8c41fe5dfc3d05db513a6a0174493b358e Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.459806 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.462373 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.469961 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.471678 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.471810 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.472805 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-6d76t" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.472916 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.478657 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.481313 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.577026 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfkpc\" (UniqueName: \"kubernetes.io/projected/dbb5263b-e98b-48a4-825e-ffb99738059f-kube-api-access-tfkpc\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.577111 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb5263b-e98b-48a4-825e-ffb99738059f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.577139 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb5263b-e98b-48a4-825e-ffb99738059f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.577177 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbb5263b-e98b-48a4-825e-ffb99738059f-kolla-config\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.577205 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbb5263b-e98b-48a4-825e-ffb99738059f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.577241 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-41a313ea-79bf-4dd0-b8b9-37985dd58162\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41a313ea-79bf-4dd0-b8b9-37985dd58162\") pod \"openstack-galera-0\" (UID: 
\"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.577291 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbb5263b-e98b-48a4-825e-ffb99738059f-config-data-default\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.577647 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbb5263b-e98b-48a4-825e-ffb99738059f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.623819 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.681486 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb5263b-e98b-48a4-825e-ffb99738059f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.681540 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb5263b-e98b-48a4-825e-ffb99738059f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.681576 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbb5263b-e98b-48a4-825e-ffb99738059f-kolla-config\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.681624 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbb5263b-e98b-48a4-825e-ffb99738059f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.681659 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-41a313ea-79bf-4dd0-b8b9-37985dd58162\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41a313ea-79bf-4dd0-b8b9-37985dd58162\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.681733 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbb5263b-e98b-48a4-825e-ffb99738059f-config-data-default\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.681805 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/dbb5263b-e98b-48a4-825e-ffb99738059f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.681833 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfkpc\" (UniqueName: \"kubernetes.io/projected/dbb5263b-e98b-48a4-825e-ffb99738059f-kube-api-access-tfkpc\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.682306 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbb5263b-e98b-48a4-825e-ffb99738059f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.683647 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbb5263b-e98b-48a4-825e-ffb99738059f-config-data-default\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.684999 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbb5263b-e98b-48a4-825e-ffb99738059f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.685299 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbb5263b-e98b-48a4-825e-ffb99738059f-kolla-config\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.687890 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb5263b-e98b-48a4-825e-ffb99738059f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.692790 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.692847 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-41a313ea-79bf-4dd0-b8b9-37985dd58162\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41a313ea-79bf-4dd0-b8b9-37985dd58162\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dd7ecf7ddab7ebc969bbe285d3c4ffeffb4e5bcd0b09dcb3745dab2800ffb7ec/globalmount\"" pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.698140 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb5263b-e98b-48a4-825e-ffb99738059f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.698467 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfkpc\" (UniqueName: \"kubernetes.io/projected/dbb5263b-e98b-48a4-825e-ffb99738059f-kube-api-access-tfkpc\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.758124 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-41a313ea-79bf-4dd0-b8b9-37985dd58162\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41a313ea-79bf-4dd0-b8b9-37985dd58162\") pod \"openstack-galera-0\" (UID: \"dbb5263b-e98b-48a4-825e-ffb99738059f\") " pod="openstack/openstack-galera-0" Jan 05 14:06:37 crc kubenswrapper[4740]: I0105 14:06:37.832312 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 05 14:06:38 crc kubenswrapper[4740]: I0105 14:06:38.013640 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"eeb4c870-b0d8-4d92-82c1-aedb35200c4b","Type":"ContainerStarted","Data":"fb4dceae4dea151c88f744052f4169082814f68c78a72e4e8329fe08490ab23f"} Jan 05 14:06:38 crc kubenswrapper[4740]: I0105 14:06:38.017445 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299","Type":"ContainerStarted","Data":"253c93cf4975e48e292f3708d36f0d8c41fe5dfc3d05db513a6a0174493b358e"} Jan 05 14:06:38 crc kubenswrapper[4740]: I0105 14:06:38.020005 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"54c80dba-b90f-4288-a366-4ff77f76db22","Type":"ContainerStarted","Data":"a27ec4ba6f23396161006fb1834eb46af6199f739eac243960fc45cb043e187b"} Jan 05 14:06:38 crc kubenswrapper[4740]: I0105 14:06:38.023537 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"4396968c-d77b-434d-888f-3ab578514bbe","Type":"ContainerStarted","Data":"393e2fc8f51515016829a7510a15cb7318f43a9f29dcadd9b073868599335d61"} Jan 05 14:06:38 crc kubenswrapper[4740]: I0105 14:06:38.466952 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.010796 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.021436 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.022588 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.030991 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-4g88r" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.031107 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.031270 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.037895 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.082535 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbb5263b-e98b-48a4-825e-ffb99738059f","Type":"ContainerStarted","Data":"6bbcda541d10b0c0734af50be9a1eb6200bd2d8cb8f2f2c9d412a9f836c3416f"} Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.115012 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/261c319a-37da-4987-a774-ecc24fa6b083-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.115098 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/261c319a-37da-4987-a774-ecc24fa6b083-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.115133 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/261c319a-37da-4987-a774-ecc24fa6b083-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.115168 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-43a7b8c6-1cd0-4394-97b9-d343c7039bb4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43a7b8c6-1cd0-4394-97b9-d343c7039bb4\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.115188 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/261c319a-37da-4987-a774-ecc24fa6b083-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.115217 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95mrd\" (UniqueName: 
\"kubernetes.io/projected/261c319a-37da-4987-a774-ecc24fa6b083-kube-api-access-95mrd\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.115252 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/261c319a-37da-4987-a774-ecc24fa6b083-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.115290 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/261c319a-37da-4987-a774-ecc24fa6b083-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.217385 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/261c319a-37da-4987-a774-ecc24fa6b083-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.217670 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/261c319a-37da-4987-a774-ecc24fa6b083-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.217697 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/261c319a-37da-4987-a774-ecc24fa6b083-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.217731 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-43a7b8c6-1cd0-4394-97b9-d343c7039bb4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43a7b8c6-1cd0-4394-97b9-d343c7039bb4\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.217753 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/261c319a-37da-4987-a774-ecc24fa6b083-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.217783 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95mrd\" (UniqueName: \"kubernetes.io/projected/261c319a-37da-4987-a774-ecc24fa6b083-kube-api-access-95mrd\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.217823 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/261c319a-37da-4987-a774-ecc24fa6b083-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.217861 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/261c319a-37da-4987-a774-ecc24fa6b083-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.218266 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/261c319a-37da-4987-a774-ecc24fa6b083-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.219666 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/261c319a-37da-4987-a774-ecc24fa6b083-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.220413 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/261c319a-37da-4987-a774-ecc24fa6b083-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.223160 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/261c319a-37da-4987-a774-ecc24fa6b083-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.236353 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/261c319a-37da-4987-a774-ecc24fa6b083-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.236712 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/261c319a-37da-4987-a774-ecc24fa6b083-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.253739 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95mrd\" (UniqueName: \"kubernetes.io/projected/261c319a-37da-4987-a774-ecc24fa6b083-kube-api-access-95mrd\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.264995 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.265031 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-43a7b8c6-1cd0-4394-97b9-d343c7039bb4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43a7b8c6-1cd0-4394-97b9-d343c7039bb4\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cbd575d21d6737a27e160c68ec5126ff8402b0802ad746529cd2dea3f6fab098/globalmount\"" pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.341605 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.346934 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.351004 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-fqprc" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.351205 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.351269 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.364376 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.414734 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-43a7b8c6-1cd0-4394-97b9-d343c7039bb4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43a7b8c6-1cd0-4394-97b9-d343c7039bb4\") pod \"openstack-cell1-galera-0\" (UID: \"261c319a-37da-4987-a774-ecc24fa6b083\") " pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.420817 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204a8fe1-f0b3-4b24-9496-ee5c800200d8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"204a8fe1-f0b3-4b24-9496-ee5c800200d8\") " pod="openstack/memcached-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.420902 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/204a8fe1-f0b3-4b24-9496-ee5c800200d8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"204a8fe1-f0b3-4b24-9496-ee5c800200d8\") " pod="openstack/memcached-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.420963 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/204a8fe1-f0b3-4b24-9496-ee5c800200d8-kolla-config\") pod \"memcached-0\" (UID: \"204a8fe1-f0b3-4b24-9496-ee5c800200d8\") " pod="openstack/memcached-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.421023 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/204a8fe1-f0b3-4b24-9496-ee5c800200d8-config-data\") pod \"memcached-0\" (UID: \"204a8fe1-f0b3-4b24-9496-ee5c800200d8\") " pod="openstack/memcached-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.421055 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52j89\" (UniqueName: \"kubernetes.io/projected/204a8fe1-f0b3-4b24-9496-ee5c800200d8-kube-api-access-52j89\") pod \"memcached-0\" (UID: \"204a8fe1-f0b3-4b24-9496-ee5c800200d8\") " pod="openstack/memcached-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.524768 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/204a8fe1-f0b3-4b24-9496-ee5c800200d8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"204a8fe1-f0b3-4b24-9496-ee5c800200d8\") " pod="openstack/memcached-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.524843 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/204a8fe1-f0b3-4b24-9496-ee5c800200d8-kolla-config\") pod \"memcached-0\" (UID: \"204a8fe1-f0b3-4b24-9496-ee5c800200d8\") " pod="openstack/memcached-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.524921 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/204a8fe1-f0b3-4b24-9496-ee5c800200d8-config-data\") pod \"memcached-0\" (UID: \"204a8fe1-f0b3-4b24-9496-ee5c800200d8\") " pod="openstack/memcached-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.524966 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52j89\" (UniqueName: \"kubernetes.io/projected/204a8fe1-f0b3-4b24-9496-ee5c800200d8-kube-api-access-52j89\") pod \"memcached-0\" (UID: \"204a8fe1-f0b3-4b24-9496-ee5c800200d8\") " pod="openstack/memcached-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.525042 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204a8fe1-f0b3-4b24-9496-ee5c800200d8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"204a8fe1-f0b3-4b24-9496-ee5c800200d8\") " pod="openstack/memcached-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.527044 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/204a8fe1-f0b3-4b24-9496-ee5c800200d8-config-data\") pod \"memcached-0\" (UID: \"204a8fe1-f0b3-4b24-9496-ee5c800200d8\") " pod="openstack/memcached-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.527135 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/204a8fe1-f0b3-4b24-9496-ee5c800200d8-kolla-config\") pod \"memcached-0\" (UID: \"204a8fe1-f0b3-4b24-9496-ee5c800200d8\") " pod="openstack/memcached-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.530397 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/204a8fe1-f0b3-4b24-9496-ee5c800200d8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"204a8fe1-f0b3-4b24-9496-ee5c800200d8\") " pod="openstack/memcached-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.543999 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52j89\" (UniqueName: \"kubernetes.io/projected/204a8fe1-f0b3-4b24-9496-ee5c800200d8-kube-api-access-52j89\") pod \"memcached-0\" (UID: \"204a8fe1-f0b3-4b24-9496-ee5c800200d8\") " pod="openstack/memcached-0" Jan 05 14:06:39 crc kubenswrapper[4740]: 
I0105 14:06:39.555812 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204a8fe1-f0b3-4b24-9496-ee5c800200d8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"204a8fe1-f0b3-4b24-9496-ee5c800200d8\") " pod="openstack/memcached-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.674357 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 05 14:06:39 crc kubenswrapper[4740]: I0105 14:06:39.733470 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 05 14:06:40 crc kubenswrapper[4740]: I0105 14:06:40.832720 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 05 14:06:40 crc kubenswrapper[4740]: I0105 14:06:40.852207 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 05 14:06:40 crc kubenswrapper[4740]: W0105 14:06:40.867799 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod261c319a_37da_4987_a774_ecc24fa6b083.slice/crio-406dc220b80b75e95643f5b8542c909ac1e294ac0d0b5c12877ab50a14d4a147 WatchSource:0}: Error finding container 406dc220b80b75e95643f5b8542c909ac1e294ac0d0b5c12877ab50a14d4a147: Status 404 returned error can't find the container with id 406dc220b80b75e95643f5b8542c909ac1e294ac0d0b5c12877ab50a14d4a147 Jan 05 14:06:41 crc kubenswrapper[4740]: I0105 14:06:41.182557 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"261c319a-37da-4987-a774-ecc24fa6b083","Type":"ContainerStarted","Data":"406dc220b80b75e95643f5b8542c909ac1e294ac0d0b5c12877ab50a14d4a147"} Jan 05 14:06:41 crc kubenswrapper[4740]: I0105 14:06:41.214438 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"204a8fe1-f0b3-4b24-9496-ee5c800200d8","Type":"ContainerStarted","Data":"903b44d6ec56d491364844cee36be70b3ac2a2852a9eb784f2dd2958dd606771"} Jan 05 14:06:41 crc kubenswrapper[4740]: I0105 14:06:41.269000 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 05 14:06:41 crc kubenswrapper[4740]: I0105 14:06:41.270336 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 05 14:06:41 crc kubenswrapper[4740]: I0105 14:06:41.275821 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-dcnfn" Jan 05 14:06:41 crc kubenswrapper[4740]: I0105 14:06:41.326155 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 05 14:06:41 crc kubenswrapper[4740]: I0105 14:06:41.382307 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smc9r\" (UniqueName: \"kubernetes.io/projected/81930e93-1484-4b87-9aeb-f05bd0de40b5-kube-api-access-smc9r\") pod \"kube-state-metrics-0\" (UID: \"81930e93-1484-4b87-9aeb-f05bd0de40b5\") " pod="openstack/kube-state-metrics-0" Jan 05 14:06:41 crc kubenswrapper[4740]: I0105 14:06:41.484236 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smc9r\" (UniqueName: \"kubernetes.io/projected/81930e93-1484-4b87-9aeb-f05bd0de40b5-kube-api-access-smc9r\") pod \"kube-state-metrics-0\" (UID: \"81930e93-1484-4b87-9aeb-f05bd0de40b5\") " pod="openstack/kube-state-metrics-0" Jan 05 14:06:41 crc kubenswrapper[4740]: I0105 14:06:41.551664 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smc9r\" (UniqueName: \"kubernetes.io/projected/81930e93-1484-4b87-9aeb-f05bd0de40b5-kube-api-access-smc9r\") pod \"kube-state-metrics-0\" (UID: \"81930e93-1484-4b87-9aeb-f05bd0de40b5\") " pod="openstack/kube-state-metrics-0" Jan 05 14:06:41 crc kubenswrapper[4740]: I0105 14:06:41.642863 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.217928 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-p96xs"] Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.239172 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-p96xs" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.247523 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-vdtsd" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.247702 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.262542 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-p96xs"] Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.335766 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn7pv\" (UniqueName: \"kubernetes.io/projected/7d16bb19-c116-4398-9475-d7dbcfee470a-kube-api-access-nn7pv\") pod \"observability-ui-dashboards-66cbf594b5-p96xs\" (UID: \"7d16bb19-c116-4398-9475-d7dbcfee470a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-p96xs" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.335896 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d16bb19-c116-4398-9475-d7dbcfee470a-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-p96xs\" (UID: \"7d16bb19-c116-4398-9475-d7dbcfee470a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-p96xs" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.449928 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d16bb19-c116-4398-9475-d7dbcfee470a-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-p96xs\" (UID: \"7d16bb19-c116-4398-9475-d7dbcfee470a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-p96xs" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.450139 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn7pv\" (UniqueName: \"kubernetes.io/projected/7d16bb19-c116-4398-9475-d7dbcfee470a-kube-api-access-nn7pv\") pod \"observability-ui-dashboards-66cbf594b5-p96xs\" (UID: \"7d16bb19-c116-4398-9475-d7dbcfee470a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-p96xs" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.490215 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d16bb19-c116-4398-9475-d7dbcfee470a-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-p96xs\" (UID: \"7d16bb19-c116-4398-9475-d7dbcfee470a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-p96xs" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.491356 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn7pv\" (UniqueName: \"kubernetes.io/projected/7d16bb19-c116-4398-9475-d7dbcfee470a-kube-api-access-nn7pv\") pod \"observability-ui-dashboards-66cbf594b5-p96xs\" (UID: \"7d16bb19-c116-4398-9475-d7dbcfee470a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-p96xs" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.535801 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.546502 4740 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.557119 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.564008 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.576619 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.577276 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.577479 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.577622 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.577801 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-b78ns" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.577967 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.595499 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-78554f9b97-scjhq"] Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.598256 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.600407 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-p96xs" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.619145 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.657146 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78554f9b97-scjhq"] Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.658892 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.658937 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/322864a4-da88-47f5-9d44-89a38dd2d8f3-oauth-serving-cert\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.658964 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-config\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.658994 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322864a4-da88-47f5-9d44-89a38dd2d8f3-trusted-ca-bundle\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.659033 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/322864a4-da88-47f5-9d44-89a38dd2d8f3-console-oauth-config\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.659076 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.659101 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/322864a4-da88-47f5-9d44-89a38dd2d8f3-console-config\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.659123 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.659139 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjv2r\" (UniqueName: \"kubernetes.io/projected/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-kube-api-access-hjv2r\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.659172 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.659226 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/322864a4-da88-47f5-9d44-89a38dd2d8f3-console-serving-cert\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.659254 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.659279 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.659310 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/322864a4-da88-47f5-9d44-89a38dd2d8f3-service-ca\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.659334 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.659353 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdnf6\" (UniqueName: \"kubernetes.io/projected/322864a4-da88-47f5-9d44-89a38dd2d8f3-kube-api-access-zdnf6\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " 
pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.659390 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.721780 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.761330 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.761390 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/322864a4-da88-47f5-9d44-89a38dd2d8f3-oauth-serving-cert\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.761413 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-config\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.761469 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322864a4-da88-47f5-9d44-89a38dd2d8f3-trusted-ca-bundle\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.761500 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/322864a4-da88-47f5-9d44-89a38dd2d8f3-console-oauth-config\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.762601 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.762632 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/322864a4-da88-47f5-9d44-89a38dd2d8f3-console-config\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.762654 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.762673 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjv2r\" (UniqueName: \"kubernetes.io/projected/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-kube-api-access-hjv2r\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.762701 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.762744 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/322864a4-da88-47f5-9d44-89a38dd2d8f3-console-serving-cert\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.762771 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.762794 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.762828 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/322864a4-da88-47f5-9d44-89a38dd2d8f3-service-ca\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.762849 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.762870 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdnf6\" (UniqueName: \"kubernetes.io/projected/322864a4-da88-47f5-9d44-89a38dd2d8f3-kube-api-access-zdnf6\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.762902 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.762947 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322864a4-da88-47f5-9d44-89a38dd2d8f3-trusted-ca-bundle\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.763623 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/322864a4-da88-47f5-9d44-89a38dd2d8f3-console-config\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.764410 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.765390 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/322864a4-da88-47f5-9d44-89a38dd2d8f3-oauth-serving-cert\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.766645 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/322864a4-da88-47f5-9d44-89a38dd2d8f3-service-ca\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.766662 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.767275 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.770138 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 
14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.771140 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.771797 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/322864a4-da88-47f5-9d44-89a38dd2d8f3-console-serving-cert\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.780530 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/322864a4-da88-47f5-9d44-89a38dd2d8f3-console-oauth-config\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.788836 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.788896 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4467b90e9c5a99a8934af67251ce6e91a7d02d2690b27a0634fda785dce76400/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.792561 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.795660 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-config\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.796386 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdnf6\" (UniqueName: \"kubernetes.io/projected/322864a4-da88-47f5-9d44-89a38dd2d8f3-kube-api-access-zdnf6\") pod \"console-78554f9b97-scjhq\" (UID: \"322864a4-da88-47f5-9d44-89a38dd2d8f3\") " pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.803825 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.804106 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjv2r\" (UniqueName: \"kubernetes.io/projected/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-kube-api-access-hjv2r\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: W0105 14:06:42.816514 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81930e93_1484_4b87_9aeb_f05bd0de40b5.slice/crio-5748fa50e42eb88beb38468614598425fdc88467bf853e4b3c4a433d12a2795d WatchSource:0}: Error finding container 5748fa50e42eb88beb38468614598425fdc88467bf853e4b3c4a433d12a2795d: Status 404 returned error can't find the container with id 5748fa50e42eb88beb38468614598425fdc88467bf853e4b3c4a433d12a2795d Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.877376 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\") pod \"prometheus-metric-storage-0\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.925042 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 05 14:06:42 crc kubenswrapper[4740]: I0105 14:06:42.953575 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:06:43 crc kubenswrapper[4740]: I0105 14:06:43.313898 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"81930e93-1484-4b87-9aeb-f05bd0de40b5","Type":"ContainerStarted","Data":"5748fa50e42eb88beb38468614598425fdc88467bf853e4b3c4a433d12a2795d"} Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.450869 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hr9pm"] Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.452367 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.457282 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-4qsrf" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.457460 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.457691 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.516648 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hr9pm"] Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.569807 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-zgtw8"] Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.578287 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.605429 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a6073e2-c1cc-4ecb-982c-7e872257e83d-var-run\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.605500 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9a6073e2-c1cc-4ecb-982c-7e872257e83d-etc-ovs\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.605550 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da00ff66-0241-449f-9ceb-9c9849d5f646-scripts\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.605610 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s9qx\" (UniqueName: \"kubernetes.io/projected/da00ff66-0241-449f-9ceb-9c9849d5f646-kube-api-access-6s9qx\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.605652 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9a6073e2-c1cc-4ecb-982c-7e872257e83d-var-log\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.605774 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da00ff66-0241-449f-9ceb-9c9849d5f646-combined-ca-bundle\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.605822 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a6073e2-c1cc-4ecb-982c-7e872257e83d-scripts\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.605947 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/da00ff66-0241-449f-9ceb-9c9849d5f646-var-run-ovn\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.606003 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/da00ff66-0241-449f-9ceb-9c9849d5f646-ovn-controller-tls-certs\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " 
pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.606031 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da00ff66-0241-449f-9ceb-9c9849d5f646-var-run\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.606046 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhnqt\" (UniqueName: \"kubernetes.io/projected/9a6073e2-c1cc-4ecb-982c-7e872257e83d-kube-api-access-zhnqt\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.606127 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/da00ff66-0241-449f-9ceb-9c9849d5f646-var-log-ovn\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.606189 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9a6073e2-c1cc-4ecb-982c-7e872257e83d-var-lib\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.621192 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zgtw8"] Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.709769 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9a6073e2-c1cc-4ecb-982c-7e872257e83d-var-log\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.709935 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da00ff66-0241-449f-9ceb-9c9849d5f646-combined-ca-bundle\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.710029 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a6073e2-c1cc-4ecb-982c-7e872257e83d-scripts\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.710090 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/da00ff66-0241-449f-9ceb-9c9849d5f646-var-run-ovn\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.710134 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/da00ff66-0241-449f-9ceb-9c9849d5f646-ovn-controller-tls-certs\") pod \"ovn-controller-hr9pm\" (UID: 
\"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.710170 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da00ff66-0241-449f-9ceb-9c9849d5f646-var-run\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.710202 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhnqt\" (UniqueName: \"kubernetes.io/projected/9a6073e2-c1cc-4ecb-982c-7e872257e83d-kube-api-access-zhnqt\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.710255 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/da00ff66-0241-449f-9ceb-9c9849d5f646-var-log-ovn\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.710340 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9a6073e2-c1cc-4ecb-982c-7e872257e83d-var-lib\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.710388 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a6073e2-c1cc-4ecb-982c-7e872257e83d-var-run\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.710431 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9a6073e2-c1cc-4ecb-982c-7e872257e83d-etc-ovs\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.710488 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da00ff66-0241-449f-9ceb-9c9849d5f646-scripts\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.710559 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s9qx\" (UniqueName: \"kubernetes.io/projected/da00ff66-0241-449f-9ceb-9c9849d5f646-kube-api-access-6s9qx\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.711250 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da00ff66-0241-449f-9ceb-9c9849d5f646-var-run\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.711414 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" 
(UniqueName: \"kubernetes.io/host-path/9a6073e2-c1cc-4ecb-982c-7e872257e83d-var-log\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.712311 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/da00ff66-0241-449f-9ceb-9c9849d5f646-var-log-ovn\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.712517 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9a6073e2-c1cc-4ecb-982c-7e872257e83d-var-lib\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.712575 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a6073e2-c1cc-4ecb-982c-7e872257e83d-var-run\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.712732 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9a6073e2-c1cc-4ecb-982c-7e872257e83d-etc-ovs\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.716967 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/da00ff66-0241-449f-9ceb-9c9849d5f646-var-run-ovn\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.717350 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da00ff66-0241-449f-9ceb-9c9849d5f646-scripts\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.723919 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a6073e2-c1cc-4ecb-982c-7e872257e83d-scripts\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.729589 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/da00ff66-0241-449f-9ceb-9c9849d5f646-ovn-controller-tls-certs\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.733645 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s9qx\" (UniqueName: \"kubernetes.io/projected/da00ff66-0241-449f-9ceb-9c9849d5f646-kube-api-access-6s9qx\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.733701 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da00ff66-0241-449f-9ceb-9c9849d5f646-combined-ca-bundle\") pod \"ovn-controller-hr9pm\" (UID: \"da00ff66-0241-449f-9ceb-9c9849d5f646\") " pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.733650 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhnqt\" (UniqueName: \"kubernetes.io/projected/9a6073e2-c1cc-4ecb-982c-7e872257e83d-kube-api-access-zhnqt\") pod \"ovn-controller-ovs-zgtw8\" (UID: \"9a6073e2-c1cc-4ecb-982c-7e872257e83d\") " pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.801603 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hr9pm" Jan 05 14:06:44 crc kubenswrapper[4740]: I0105 14:06:44.900879 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.084614 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.087297 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.093376 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.093446 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-fjkzk" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.094402 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.095336 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.095982 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.103902 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.131121 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81bea285-8670-4e86-a0a0-df327d8cf009-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.131179 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2f343b32-e33e-48f1-929d-a31f8b5a9b6f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f343b32-e33e-48f1-929d-a31f8b5a9b6f\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.232987 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81bea285-8670-4e86-a0a0-df327d8cf009-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc 
kubenswrapper[4740]: I0105 14:06:45.233096 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81bea285-8670-4e86-a0a0-df327d8cf009-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.233145 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2f343b32-e33e-48f1-929d-a31f8b5a9b6f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f343b32-e33e-48f1-929d-a31f8b5a9b6f\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.233225 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tstrk\" (UniqueName: \"kubernetes.io/projected/81bea285-8670-4e86-a0a0-df327d8cf009-kube-api-access-tstrk\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.233305 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/81bea285-8670-4e86-a0a0-df327d8cf009-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.233362 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81bea285-8670-4e86-a0a0-df327d8cf009-config\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.233418 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/81bea285-8670-4e86-a0a0-df327d8cf009-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.233437 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/81bea285-8670-4e86-a0a0-df327d8cf009-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.234054 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81bea285-8670-4e86-a0a0-df327d8cf009-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.242577 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.242627 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2f343b32-e33e-48f1-929d-a31f8b5a9b6f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f343b32-e33e-48f1-929d-a31f8b5a9b6f\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/76fdc91fb77c906af3f5814d8c3aa6bc19ad13fb6a59f3c45bc8740a6858de43/globalmount\"" pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.286657 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2f343b32-e33e-48f1-929d-a31f8b5a9b6f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f343b32-e33e-48f1-929d-a31f8b5a9b6f\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.335176 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81bea285-8670-4e86-a0a0-df327d8cf009-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.335262 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tstrk\" (UniqueName: \"kubernetes.io/projected/81bea285-8670-4e86-a0a0-df327d8cf009-kube-api-access-tstrk\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.335307 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/81bea285-8670-4e86-a0a0-df327d8cf009-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.335340 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81bea285-8670-4e86-a0a0-df327d8cf009-config\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.335379 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/81bea285-8670-4e86-a0a0-df327d8cf009-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.335400 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/81bea285-8670-4e86-a0a0-df327d8cf009-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.335932 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/81bea285-8670-4e86-a0a0-df327d8cf009-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" 
Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.336902 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81bea285-8670-4e86-a0a0-df327d8cf009-config\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.347187 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81bea285-8670-4e86-a0a0-df327d8cf009-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.348336 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/81bea285-8670-4e86-a0a0-df327d8cf009-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.351953 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/81bea285-8670-4e86-a0a0-df327d8cf009-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.352382 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tstrk\" (UniqueName: \"kubernetes.io/projected/81bea285-8670-4e86-a0a0-df327d8cf009-kube-api-access-tstrk\") pod \"ovsdbserver-nb-0\" (UID: \"81bea285-8670-4e86-a0a0-df327d8cf009\") " pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:45 crc kubenswrapper[4740]: I0105 14:06:45.429360 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.676617 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.678625 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.680555 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.682616 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-ghd7d" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.683797 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.686047 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.699798 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.829580 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66d1fba-b858-4c18-9ddc-d329614afe92-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.829635 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b66d1fba-b858-4c18-9ddc-d329614afe92-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.829653 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjqqq\" (UniqueName: \"kubernetes.io/projected/b66d1fba-b858-4c18-9ddc-d329614afe92-kube-api-access-wjqqq\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.829772 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5b043bf6-bbec-45b8-abcf-5fbc3b022fba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5b043bf6-bbec-45b8-abcf-5fbc3b022fba\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.829841 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66d1fba-b858-4c18-9ddc-d329614afe92-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.829874 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b66d1fba-b858-4c18-9ddc-d329614afe92-config\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.829938 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b66d1fba-b858-4c18-9ddc-d329614afe92-ovsdb-rundir\") pod 
\"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.829993 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66d1fba-b858-4c18-9ddc-d329614afe92-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.932351 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5b043bf6-bbec-45b8-abcf-5fbc3b022fba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5b043bf6-bbec-45b8-abcf-5fbc3b022fba\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.932444 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66d1fba-b858-4c18-9ddc-d329614afe92-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.932475 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b66d1fba-b858-4c18-9ddc-d329614afe92-config\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.932537 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b66d1fba-b858-4c18-9ddc-d329614afe92-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.932594 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66d1fba-b858-4c18-9ddc-d329614afe92-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.932671 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66d1fba-b858-4c18-9ddc-d329614afe92-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.932710 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b66d1fba-b858-4c18-9ddc-d329614afe92-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.932732 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjqqq\" (UniqueName: \"kubernetes.io/projected/b66d1fba-b858-4c18-9ddc-d329614afe92-kube-api-access-wjqqq\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.933156 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b66d1fba-b858-4c18-9ddc-d329614afe92-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.933500 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b66d1fba-b858-4c18-9ddc-d329614afe92-config\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.934159 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b66d1fba-b858-4c18-9ddc-d329614afe92-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.936643 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.936688 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5b043bf6-bbec-45b8-abcf-5fbc3b022fba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5b043bf6-bbec-45b8-abcf-5fbc3b022fba\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0d315a2690335af0c8d1fd681dc3ac0aa8976204cc93026575cb0025cb5fd47b/globalmount\"" pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.944022 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66d1fba-b858-4c18-9ddc-d329614afe92-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.944144 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66d1fba-b858-4c18-9ddc-d329614afe92-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.945109 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66d1fba-b858-4c18-9ddc-d329614afe92-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.965350 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjqqq\" (UniqueName: \"kubernetes.io/projected/b66d1fba-b858-4c18-9ddc-d329614afe92-kube-api-access-wjqqq\") pod \"ovsdbserver-sb-0\" (UID: \"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:48 crc kubenswrapper[4740]: I0105 14:06:48.974695 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5b043bf6-bbec-45b8-abcf-5fbc3b022fba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5b043bf6-bbec-45b8-abcf-5fbc3b022fba\") pod \"ovsdbserver-sb-0\" (UID: 
\"b66d1fba-b858-4c18-9ddc-d329614afe92\") " pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:49 crc kubenswrapper[4740]: I0105 14:06:49.004397 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 05 14:06:54 crc kubenswrapper[4740]: I0105 14:06:54.772132 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-p96xs"] Jan 05 14:07:01 crc kubenswrapper[4740]: E0105 14:07:01.690292 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Jan 05 14:07:01 crc kubenswrapper[4740]: E0105 14:07:01.691869 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tfkpc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(dbb5263b-e98b-48a4-825e-ffb99738059f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:07:01 crc kubenswrapper[4740]: E0105 14:07:01.693450 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="dbb5263b-e98b-48a4-825e-ffb99738059f" Jan 05 14:07:01 crc kubenswrapper[4740]: I0105 14:07:01.916581 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:07:01 crc kubenswrapper[4740]: I0105 14:07:01.916670 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:07:01 crc kubenswrapper[4740]: I0105 14:07:01.916774 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 14:07:01 crc kubenswrapper[4740]: I0105 14:07:01.917909 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d1fef0aff31cd613b2d207d472e346207cc1735aecec60c1ba22825fb1d47f9d"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 14:07:01 crc kubenswrapper[4740]: I0105 14:07:01.918014 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://d1fef0aff31cd613b2d207d472e346207cc1735aecec60c1ba22825fb1d47f9d" gracePeriod=600 Jan 05 14:07:02 crc kubenswrapper[4740]: I0105 14:07:02.549351 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="d1fef0aff31cd613b2d207d472e346207cc1735aecec60c1ba22825fb1d47f9d" exitCode=0 Jan 05 14:07:02 crc kubenswrapper[4740]: I0105 14:07:02.550477 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"d1fef0aff31cd613b2d207d472e346207cc1735aecec60c1ba22825fb1d47f9d"} Jan 05 14:07:02 crc kubenswrapper[4740]: I0105 14:07:02.550540 4740 scope.go:117] "RemoveContainer" containerID="615f01064ee45ac723f59788df185ab69f9b600b12c150fcc649dcf97daf611a" Jan 05 14:07:02 crc kubenswrapper[4740]: E0105 14:07:02.556720 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="dbb5263b-e98b-48a4-825e-ffb99738059f" Jan 05 14:07:02 crc kubenswrapper[4740]: E0105 14:07:02.644595 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Jan 05 14:07:02 crc kubenswrapper[4740]: E0105 14:07:02.644735 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > 
/var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jxvnp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-2_openstack(eeb4c870-b0d8-4d92-82c1-aedb35200c4b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:07:02 crc kubenswrapper[4740]: E0105 14:07:02.645834 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-2" podUID="eeb4c870-b0d8-4d92-82c1-aedb35200c4b" Jan 05 14:07:02 crc kubenswrapper[4740]: E0105 14:07:02.651471 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Jan 05 14:07:02 crc kubenswrapper[4740]: E0105 14:07:02.651679 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-95mrd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(261c319a-37da-4987-a774-ecc24fa6b083): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:07:02 crc kubenswrapper[4740]: E0105 14:07:02.652750 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="261c319a-37da-4987-a774-ecc24fa6b083" Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.338998 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.339782 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n9fh58fhf6h58dh57h64ch648hc9h548h549h75h5d7h77hcfh8fhd9h94h76h6hfch5f4hd4h669hf5h679h658h64chddhb4h7bh588h58fq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-52j89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(204a8fe1-f0b3-4b24-9496-ee5c800200d8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.341120 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="204a8fe1-f0b3-4b24-9496-ee5c800200d8" Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.358883 4740 log.go:32] "PullImage 
from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.359049 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2k26n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-1_openstack(4396968c-d77b-434d-888f-3ab578514bbe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.359589 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.359763 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qchns,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(3bc8ef80-7c3c-4e61-8eaf-294ab4a75299): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.360831 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-1" podUID="4396968c-d77b-434d-888f-3ab578514bbe" Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.360889 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.392828 4740 
log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.393041 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2m9b7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(54c80dba-b90f-4288-a366-4ff77f76db22): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.396466 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="54c80dba-b90f-4288-a366-4ff77f76db22" Jan 05 14:07:03 crc kubenswrapper[4740]: I0105 14:07:03.562397 4740 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-p96xs" event={"ID":"7d16bb19-c116-4398-9475-d7dbcfee470a","Type":"ContainerStarted","Data":"85b8d204f27313370cc4637d69c3851e653b47a64670381cfc4bf1b0af4c6ec6"} Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.564391 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="204a8fe1-f0b3-4b24-9496-ee5c800200d8" Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.566189 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.567234 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-1" podUID="4396968c-d77b-434d-888f-3ab578514bbe" Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.567503 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="54c80dba-b90f-4288-a366-4ff77f76db22" Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.568807 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-2" podUID="eeb4c870-b0d8-4d92-82c1-aedb35200c4b" Jan 05 14:07:03 crc kubenswrapper[4740]: E0105 14:07:03.576441 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="261c319a-37da-4987-a774-ecc24fa6b083" Jan 05 14:07:04 crc kubenswrapper[4740]: E0105 14:07:04.407434 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 05 14:07:04 crc kubenswrapper[4740]: E0105 14:07:04.407910 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z7nrd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-rf57k_openstack(4b7856c9-a723-48a1-bf77-d39e78f3d62c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:07:04 crc kubenswrapper[4740]: E0105 14:07:04.409841 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-rf57k" podUID="4b7856c9-a723-48a1-bf77-d39e78f3d62c" Jan 05 14:07:04 crc kubenswrapper[4740]: E0105 14:07:04.414742 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 05 14:07:04 crc kubenswrapper[4740]: E0105 14:07:04.414890 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mkdwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-bs4mf_openstack(a8a0f7e4-fb87-44e1-aa94-af4ac6db619e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:07:04 crc kubenswrapper[4740]: E0105 14:07:04.416090 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" podUID="a8a0f7e4-fb87-44e1-aa94-af4ac6db619e" Jan 05 14:07:04 crc kubenswrapper[4740]: E0105 14:07:04.428356 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 05 14:07:04 crc kubenswrapper[4740]: E0105 14:07:04.428525 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t5gf2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-7fx9n_openstack(4b270e1a-8c5f-4c3f-b070-afa15ee2cda9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:07:04 crc kubenswrapper[4740]: E0105 14:07:04.430470 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-7fx9n" podUID="4b270e1a-8c5f-4c3f-b070-afa15ee2cda9" Jan 05 14:07:04 crc kubenswrapper[4740]: E0105 14:07:04.450734 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 05 14:07:04 crc kubenswrapper[4740]: E0105 14:07:04.450922 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hfbf5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-wvwwn_openstack(1ff1f33b-0049-4089-9745-38a3da10de80): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:07:04 crc kubenswrapper[4740]: E0105 14:07:04.452936 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-wvwwn" podUID="1ff1f33b-0049-4089-9745-38a3da10de80" Jan 05 14:07:04 crc kubenswrapper[4740]: E0105 14:07:04.592648 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" podUID="a8a0f7e4-fb87-44e1-aa94-af4ac6db619e" Jan 05 14:07:04 crc kubenswrapper[4740]: E0105 14:07:04.600712 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-7fx9n" podUID="4b270e1a-8c5f-4c3f-b070-afa15ee2cda9" Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.355752 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.462615 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78554f9b97-scjhq"] Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.472746 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hr9pm"] Jan 05 14:07:05 crc kubenswrapper[4740]: W0105 
14:07:05.497206 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb66d1fba_b858_4c18_9ddc_d329614afe92.slice/crio-2ae60166151d9cb2b98f896e6d8f4447e94c8205bbe1dd645ce9deb04b0c3447 WatchSource:0}: Error finding container 2ae60166151d9cb2b98f896e6d8f4447e94c8205bbe1dd645ce9deb04b0c3447: Status 404 returned error can't find the container with id 2ae60166151d9cb2b98f896e6d8f4447e94c8205bbe1dd645ce9deb04b0c3447 Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.544839 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.597881 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-rf57k" event={"ID":"4b7856c9-a723-48a1-bf77-d39e78f3d62c","Type":"ContainerDied","Data":"51810b311715d16418bcd0937fcf07dfab90b6280c486505f4e56ad27758d4f6"} Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.598167 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51810b311715d16418bcd0937fcf07dfab90b6280c486505f4e56ad27758d4f6" Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.600937 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"7164e8ec74a1f47d6179acf6f6c20f4c18a05f4cce20c982561062888c48c311"} Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.602448 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78554f9b97-scjhq" event={"ID":"322864a4-da88-47f5-9d44-89a38dd2d8f3","Type":"ContainerStarted","Data":"94480d1ecd46826e93804dcfdb287e03552cde6c4ac97ce9a705fe9df2affa05"} Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.608727 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b66d1fba-b858-4c18-9ddc-d329614afe92","Type":"ContainerStarted","Data":"2ae60166151d9cb2b98f896e6d8f4447e94c8205bbe1dd645ce9deb04b0c3447"} Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.610577 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wvwwn" Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.611502 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-wvwwn" event={"ID":"1ff1f33b-0049-4089-9745-38a3da10de80","Type":"ContainerDied","Data":"a315cc2d7a96339923e84d2486836784591f978a9bdb1e6a8942eb7767f5b91b"} Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.629552 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rf57k" Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.724905 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfbf5\" (UniqueName: \"kubernetes.io/projected/1ff1f33b-0049-4089-9745-38a3da10de80-kube-api-access-hfbf5\") pod \"1ff1f33b-0049-4089-9745-38a3da10de80\" (UID: \"1ff1f33b-0049-4089-9745-38a3da10de80\") " Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.724965 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ff1f33b-0049-4089-9745-38a3da10de80-config\") pod \"1ff1f33b-0049-4089-9745-38a3da10de80\" (UID: \"1ff1f33b-0049-4089-9745-38a3da10de80\") " Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.725044 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b7856c9-a723-48a1-bf77-d39e78f3d62c-config\") pod \"4b7856c9-a723-48a1-bf77-d39e78f3d62c\" (UID: \"4b7856c9-a723-48a1-bf77-d39e78f3d62c\") " Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.725102 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ff1f33b-0049-4089-9745-38a3da10de80-dns-svc\") pod \"1ff1f33b-0049-4089-9745-38a3da10de80\" (UID: \"1ff1f33b-0049-4089-9745-38a3da10de80\") " Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.725232 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7nrd\" (UniqueName: \"kubernetes.io/projected/4b7856c9-a723-48a1-bf77-d39e78f3d62c-kube-api-access-z7nrd\") pod \"4b7856c9-a723-48a1-bf77-d39e78f3d62c\" (UID: \"4b7856c9-a723-48a1-bf77-d39e78f3d62c\") " Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.726456 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ff1f33b-0049-4089-9745-38a3da10de80-config" (OuterVolumeSpecName: "config") pod "1ff1f33b-0049-4089-9745-38a3da10de80" (UID: "1ff1f33b-0049-4089-9745-38a3da10de80"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.727968 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b7856c9-a723-48a1-bf77-d39e78f3d62c-config" (OuterVolumeSpecName: "config") pod "4b7856c9-a723-48a1-bf77-d39e78f3d62c" (UID: "4b7856c9-a723-48a1-bf77-d39e78f3d62c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.728744 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ff1f33b-0049-4089-9745-38a3da10de80-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1ff1f33b-0049-4089-9745-38a3da10de80" (UID: "1ff1f33b-0049-4089-9745-38a3da10de80"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.731521 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ff1f33b-0049-4089-9745-38a3da10de80-kube-api-access-hfbf5" (OuterVolumeSpecName: "kube-api-access-hfbf5") pod "1ff1f33b-0049-4089-9745-38a3da10de80" (UID: "1ff1f33b-0049-4089-9745-38a3da10de80"). InnerVolumeSpecName "kube-api-access-hfbf5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.732292 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b7856c9-a723-48a1-bf77-d39e78f3d62c-kube-api-access-z7nrd" (OuterVolumeSpecName: "kube-api-access-z7nrd") pod "4b7856c9-a723-48a1-bf77-d39e78f3d62c" (UID: "4b7856c9-a723-48a1-bf77-d39e78f3d62c"). InnerVolumeSpecName "kube-api-access-z7nrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.732690 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.828555 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfbf5\" (UniqueName: \"kubernetes.io/projected/1ff1f33b-0049-4089-9745-38a3da10de80-kube-api-access-hfbf5\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.828594 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ff1f33b-0049-4089-9745-38a3da10de80-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.828603 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b7856c9-a723-48a1-bf77-d39e78f3d62c-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.828615 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ff1f33b-0049-4089-9745-38a3da10de80-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.828626 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7nrd\" (UniqueName: \"kubernetes.io/projected/4b7856c9-a723-48a1-bf77-d39e78f3d62c-kube-api-access-z7nrd\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:05 crc kubenswrapper[4740]: I0105 14:07:05.869025 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zgtw8"] Jan 05 14:07:06 crc kubenswrapper[4740]: W0105 14:07:06.126400 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfa6a7e6_018b_43df_8084_48e1aa10e2ca.slice/crio-f9684b14067f4469a6f9b57c3e3dc1bc93cc9225d3789de988c42ab411787cf7 WatchSource:0}: Error finding container f9684b14067f4469a6f9b57c3e3dc1bc93cc9225d3789de988c42ab411787cf7: Status 404 returned error can't find the container with id f9684b14067f4469a6f9b57c3e3dc1bc93cc9225d3789de988c42ab411787cf7 Jan 05 14:07:06 crc kubenswrapper[4740]: W0105 14:07:06.140008 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81bea285_8670_4e86_a0a0_df327d8cf009.slice/crio-1005a94eca4efe0fe8beff664355be04de95c2a4bf91c705b4e250ad0611cebe WatchSource:0}: Error finding container 1005a94eca4efe0fe8beff664355be04de95c2a4bf91c705b4e250ad0611cebe: Status 404 returned error can't find the container with id 1005a94eca4efe0fe8beff664355be04de95c2a4bf91c705b4e250ad0611cebe Jan 05 14:07:06 crc kubenswrapper[4740]: I0105 14:07:06.628475 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zgtw8" event={"ID":"9a6073e2-c1cc-4ecb-982c-7e872257e83d","Type":"ContainerStarted","Data":"85eb4d7665328767c1302d15f6e78d96fb9ce214b958c19112091e3425105486"} 
Jan 05 14:07:06 crc kubenswrapper[4740]: I0105 14:07:06.629941 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cfa6a7e6-018b-43df-8084-48e1aa10e2ca","Type":"ContainerStarted","Data":"f9684b14067f4469a6f9b57c3e3dc1bc93cc9225d3789de988c42ab411787cf7"} Jan 05 14:07:06 crc kubenswrapper[4740]: I0105 14:07:06.631148 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"81bea285-8670-4e86-a0a0-df327d8cf009","Type":"ContainerStarted","Data":"1005a94eca4efe0fe8beff664355be04de95c2a4bf91c705b4e250ad0611cebe"} Jan 05 14:07:06 crc kubenswrapper[4740]: I0105 14:07:06.632285 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hr9pm" event={"ID":"da00ff66-0241-449f-9ceb-9c9849d5f646","Type":"ContainerStarted","Data":"42cb0103ac77b93580bb93bce19f5e54a7cff2dea44c20151bec1bb9f5b94bf2"} Jan 05 14:07:06 crc kubenswrapper[4740]: I0105 14:07:06.632305 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-wvwwn" Jan 05 14:07:06 crc kubenswrapper[4740]: I0105 14:07:06.632292 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rf57k" Jan 05 14:07:06 crc kubenswrapper[4740]: I0105 14:07:06.699882 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wvwwn"] Jan 05 14:07:06 crc kubenswrapper[4740]: I0105 14:07:06.707976 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-wvwwn"] Jan 05 14:07:06 crc kubenswrapper[4740]: I0105 14:07:06.745451 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rf57k"] Jan 05 14:07:06 crc kubenswrapper[4740]: I0105 14:07:06.752858 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rf57k"] Jan 05 14:07:06 crc kubenswrapper[4740]: I0105 14:07:06.989655 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ff1f33b-0049-4089-9745-38a3da10de80" path="/var/lib/kubelet/pods/1ff1f33b-0049-4089-9745-38a3da10de80/volumes" Jan 05 14:07:06 crc kubenswrapper[4740]: I0105 14:07:06.990353 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b7856c9-a723-48a1-bf77-d39e78f3d62c" path="/var/lib/kubelet/pods/4b7856c9-a723-48a1-bf77-d39e78f3d62c/volumes" Jan 05 14:07:07 crc kubenswrapper[4740]: I0105 14:07:07.659457 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78554f9b97-scjhq" event={"ID":"322864a4-da88-47f5-9d44-89a38dd2d8f3","Type":"ContainerStarted","Data":"e77de1564485e8600e72654f9aca467222668d83c8166db4083bd9febcd24189"} Jan 05 14:07:07 crc kubenswrapper[4740]: I0105 14:07:07.678888 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78554f9b97-scjhq" podStartSLOduration=25.678870149 podStartE2EDuration="25.678870149s" podCreationTimestamp="2026-01-05 14:06:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:07:07.678767656 +0000 UTC m=+1076.985676225" watchObservedRunningTime="2026-01-05 14:07:07.678870149 +0000 UTC m=+1076.985778728" Jan 05 14:07:08 crc kubenswrapper[4740]: I0105 14:07:08.668174 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"81930e93-1484-4b87-9aeb-f05bd0de40b5","Type":"ContainerStarted","Data":"4c5137acd640422db442140325167da8d9c8a551020f5a81f47596086ad81bd4"} Jan 05 14:07:08 crc kubenswrapper[4740]: I0105 14:07:08.668552 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 05 14:07:08 crc kubenswrapper[4740]: I0105 14:07:08.672654 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-p96xs" event={"ID":"7d16bb19-c116-4398-9475-d7dbcfee470a","Type":"ContainerStarted","Data":"99cbcf024b5ce59eed24b2dc85071629b8b69c2a03ce42b662d9e97609bf96ab"} Jan 05 14:07:08 crc kubenswrapper[4740]: I0105 14:07:08.702119 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.055848435 podStartE2EDuration="27.702102132s" podCreationTimestamp="2026-01-05 14:06:41 +0000 UTC" firstStartedPulling="2026-01-05 14:06:42.823494842 +0000 UTC m=+1052.130403421" lastFinishedPulling="2026-01-05 14:07:07.469748509 +0000 UTC m=+1076.776657118" observedRunningTime="2026-01-05 14:07:08.688599222 +0000 UTC m=+1077.995507811" watchObservedRunningTime="2026-01-05 14:07:08.702102132 +0000 UTC m=+1078.009010711" Jan 05 14:07:08 crc kubenswrapper[4740]: I0105 14:07:08.710278 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-p96xs" podStartSLOduration=23.685525102 podStartE2EDuration="26.71025789s" podCreationTimestamp="2026-01-05 14:06:42 +0000 UTC" firstStartedPulling="2026-01-05 14:07:03.385404134 +0000 UTC m=+1072.692312753" lastFinishedPulling="2026-01-05 14:07:06.410136932 +0000 UTC m=+1075.717045541" observedRunningTime="2026-01-05 14:07:08.702541585 +0000 UTC m=+1078.009450164" watchObservedRunningTime="2026-01-05 14:07:08.71025789 +0000 UTC m=+1078.017166469" Jan 05 14:07:10 crc kubenswrapper[4740]: I0105 14:07:10.694626 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zgtw8" event={"ID":"9a6073e2-c1cc-4ecb-982c-7e872257e83d","Type":"ContainerStarted","Data":"339dfa6553f3cf9be3d1d3dde7788f740a1059733ad8e2ed104288cdea0941e7"} Jan 05 14:07:10 crc kubenswrapper[4740]: I0105 14:07:10.697222 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"81bea285-8670-4e86-a0a0-df327d8cf009","Type":"ContainerStarted","Data":"2c6afc5f7426b1fef800a9f90c1fde118dc23f2aade8bfb8d28c56b2d56ee2d5"} Jan 05 14:07:10 crc kubenswrapper[4740]: I0105 14:07:10.699348 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hr9pm" event={"ID":"da00ff66-0241-449f-9ceb-9c9849d5f646","Type":"ContainerStarted","Data":"56aa5fbca1262ba3d9c13e400dd9c5c3afd2081631514c1c4972bccd2355b72f"} Jan 05 14:07:10 crc kubenswrapper[4740]: I0105 14:07:10.699477 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-hr9pm" Jan 05 14:07:10 crc kubenswrapper[4740]: I0105 14:07:10.701257 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b66d1fba-b858-4c18-9ddc-d329614afe92","Type":"ContainerStarted","Data":"79cc22fc56836cef55296547e7fd5372b15b5bd75437de35aad6069c63ee944c"} Jan 05 14:07:10 crc kubenswrapper[4740]: I0105 14:07:10.739858 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-hr9pm" podStartSLOduration=22.536990284 
podStartE2EDuration="26.739810466s" podCreationTimestamp="2026-01-05 14:06:44 +0000 UTC" firstStartedPulling="2026-01-05 14:07:06.106339681 +0000 UTC m=+1075.413248260" lastFinishedPulling="2026-01-05 14:07:10.309159863 +0000 UTC m=+1079.616068442" observedRunningTime="2026-01-05 14:07:10.732849459 +0000 UTC m=+1080.039758088" watchObservedRunningTime="2026-01-05 14:07:10.739810466 +0000 UTC m=+1080.046719065" Jan 05 14:07:11 crc kubenswrapper[4740]: I0105 14:07:11.713646 4740 generic.go:334] "Generic (PLEG): container finished" podID="9a6073e2-c1cc-4ecb-982c-7e872257e83d" containerID="339dfa6553f3cf9be3d1d3dde7788f740a1059733ad8e2ed104288cdea0941e7" exitCode=0 Jan 05 14:07:11 crc kubenswrapper[4740]: I0105 14:07:11.714139 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zgtw8" event={"ID":"9a6073e2-c1cc-4ecb-982c-7e872257e83d","Type":"ContainerDied","Data":"339dfa6553f3cf9be3d1d3dde7788f740a1059733ad8e2ed104288cdea0941e7"} Jan 05 14:07:12 crc kubenswrapper[4740]: I0105 14:07:12.729940 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zgtw8" event={"ID":"9a6073e2-c1cc-4ecb-982c-7e872257e83d","Type":"ContainerStarted","Data":"152d2d0e8e913ef032d6678f100fdd1d3de43fdda078985727647bdc31e64291"} Jan 05 14:07:12 crc kubenswrapper[4740]: I0105 14:07:12.953978 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:07:12 crc kubenswrapper[4740]: I0105 14:07:12.954028 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:07:12 crc kubenswrapper[4740]: I0105 14:07:12.959595 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:07:13 crc kubenswrapper[4740]: I0105 14:07:13.751753 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-78554f9b97-scjhq" Jan 05 14:07:13 crc kubenswrapper[4740]: I0105 14:07:13.854573 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bbc7d56d-v8cvv"] Jan 05 14:07:14 crc kubenswrapper[4740]: I0105 14:07:14.755244 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"81bea285-8670-4e86-a0a0-df327d8cf009","Type":"ContainerStarted","Data":"f17aa3eec01a89a7617556bf3bbd182a1028f8482d3182ee2d1c782e782c2b80"} Jan 05 14:07:14 crc kubenswrapper[4740]: I0105 14:07:14.756720 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbb5263b-e98b-48a4-825e-ffb99738059f","Type":"ContainerStarted","Data":"14b51bc5cf0f6a9466bff6b637aaa821d11333c346faa4e07111305c47be26aa"} Jan 05 14:07:14 crc kubenswrapper[4740]: I0105 14:07:14.758345 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b66d1fba-b858-4c18-9ddc-d329614afe92","Type":"ContainerStarted","Data":"ae17ea2de944d561961780b4301b9be8f7d7845fa0e5a4cb41efe48e8eb955be"} Jan 05 14:07:14 crc kubenswrapper[4740]: I0105 14:07:14.760431 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zgtw8" event={"ID":"9a6073e2-c1cc-4ecb-982c-7e872257e83d","Type":"ContainerStarted","Data":"cb5ad07a887add0e42f925e0f7f50905df61264b9a721f42d0da94451fd9879c"} Jan 05 14:07:14 crc kubenswrapper[4740]: I0105 14:07:14.760614 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:07:14 crc kubenswrapper[4740]: I0105 14:07:14.760658 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:07:14 crc kubenswrapper[4740]: I0105 14:07:14.762005 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cfa6a7e6-018b-43df-8084-48e1aa10e2ca","Type":"ContainerStarted","Data":"0b97cb304c7443a8f612ae0939ddd0a643468732302ad8bbdc9539057e6d4f8a"} Jan 05 14:07:14 crc kubenswrapper[4740]: I0105 14:07:14.763743 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"204a8fe1-f0b3-4b24-9496-ee5c800200d8","Type":"ContainerStarted","Data":"4ac27c5bf5fd12bff4510f717834ded07752b37a03f733009b4f9c39f306d2cf"} Jan 05 14:07:14 crc kubenswrapper[4740]: I0105 14:07:14.764054 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 05 14:07:14 crc kubenswrapper[4740]: I0105 14:07:14.774854 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=22.766667224 podStartE2EDuration="30.774837732s" podCreationTimestamp="2026-01-05 14:06:44 +0000 UTC" firstStartedPulling="2026-01-05 14:07:06.175432719 +0000 UTC m=+1075.482341298" lastFinishedPulling="2026-01-05 14:07:14.183603227 +0000 UTC m=+1083.490511806" observedRunningTime="2026-01-05 14:07:14.771359439 +0000 UTC m=+1084.078268038" watchObservedRunningTime="2026-01-05 14:07:14.774837732 +0000 UTC m=+1084.081746321" Jan 05 14:07:14 crc kubenswrapper[4740]: I0105 14:07:14.797130 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-zgtw8" podStartSLOduration=26.668599092 podStartE2EDuration="30.797111137s" podCreationTimestamp="2026-01-05 14:06:44 +0000 UTC" firstStartedPulling="2026-01-05 14:07:06.175833299 +0000 UTC m=+1075.482741878" lastFinishedPulling="2026-01-05 14:07:10.304345344 +0000 UTC m=+1079.611253923" observedRunningTime="2026-01-05 14:07:14.791028125 +0000 UTC m=+1084.097936714" watchObservedRunningTime="2026-01-05 14:07:14.797111137 +0000 UTC m=+1084.104019726" Jan 05 14:07:14 crc kubenswrapper[4740]: I0105 14:07:14.854634 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.285299092 podStartE2EDuration="35.854620675s" podCreationTimestamp="2026-01-05 14:06:39 +0000 UTC" firstStartedPulling="2026-01-05 14:06:40.911233242 +0000 UTC m=+1050.218141821" lastFinishedPulling="2026-01-05 14:07:14.480554815 +0000 UTC m=+1083.787463404" observedRunningTime="2026-01-05 14:07:14.850613497 +0000 UTC m=+1084.157522066" watchObservedRunningTime="2026-01-05 14:07:14.854620675 +0000 UTC m=+1084.161529254" Jan 05 14:07:14 crc kubenswrapper[4740]: I0105 14:07:14.875769 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=19.207605077 podStartE2EDuration="27.875748499s" podCreationTimestamp="2026-01-05 14:06:47 +0000 UTC" firstStartedPulling="2026-01-05 14:07:05.522539554 +0000 UTC m=+1074.829448133" lastFinishedPulling="2026-01-05 14:07:14.190682976 +0000 UTC m=+1083.497591555" observedRunningTime="2026-01-05 14:07:14.868598499 +0000 UTC m=+1084.175507078" watchObservedRunningTime="2026-01-05 14:07:14.875748499 +0000 UTC m=+1084.182657078" Jan 05 14:07:15 crc kubenswrapper[4740]: I0105 14:07:15.429465 4740 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 05 14:07:15 crc kubenswrapper[4740]: I0105 14:07:15.429913 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 05 14:07:15 crc kubenswrapper[4740]: I0105 14:07:15.499050 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 05 14:07:15 crc kubenswrapper[4740]: I0105 14:07:15.829950 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.004845 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.058823 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.113041 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bs4mf"] Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.137766 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-tmtb8"] Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.139931 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.142897 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.153239 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-tmtb8"] Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.161137 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-tmtb8\" (UID: \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.161221 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-config\") pod \"dnsmasq-dns-5bf47b49b7-tmtb8\" (UID: \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.161270 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqcmn\" (UniqueName: \"kubernetes.io/projected/4e8caefd-2fb4-46aa-8d80-83376f1660bf-kube-api-access-sqcmn\") pod \"dnsmasq-dns-5bf47b49b7-tmtb8\" (UID: \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.161441 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-tmtb8\" (UID: \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.210093 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-xbfj5"] Jan 05 14:07:16 
crc kubenswrapper[4740]: I0105 14:07:16.211729 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.214739 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.224149 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xbfj5"] Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.263238 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-tmtb8\" (UID: \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.263309 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2ca95a0e-df45-403b-b278-5701175ac8e1-ovn-rundir\") pod \"ovn-controller-metrics-xbfj5\" (UID: \"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.263365 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca95a0e-df45-403b-b278-5701175ac8e1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xbfj5\" (UID: \"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.263409 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-tmtb8\" (UID: \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.263441 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca95a0e-df45-403b-b278-5701175ac8e1-combined-ca-bundle\") pod \"ovn-controller-metrics-xbfj5\" (UID: \"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.263559 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-config\") pod \"dnsmasq-dns-5bf47b49b7-tmtb8\" (UID: \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.263579 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqcmn\" (UniqueName: \"kubernetes.io/projected/4e8caefd-2fb4-46aa-8d80-83376f1660bf-kube-api-access-sqcmn\") pod \"dnsmasq-dns-5bf47b49b7-tmtb8\" (UID: \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.263618 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2ca95a0e-df45-403b-b278-5701175ac8e1-ovs-rundir\") 
pod \"ovn-controller-metrics-xbfj5\" (UID: \"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.263994 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca95a0e-df45-403b-b278-5701175ac8e1-config\") pod \"ovn-controller-metrics-xbfj5\" (UID: \"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.264032 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws5xq\" (UniqueName: \"kubernetes.io/projected/2ca95a0e-df45-403b-b278-5701175ac8e1-kube-api-access-ws5xq\") pod \"ovn-controller-metrics-xbfj5\" (UID: \"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.264446 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-tmtb8\" (UID: \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.264485 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-config\") pod \"dnsmasq-dns-5bf47b49b7-tmtb8\" (UID: \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.265185 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-tmtb8\" (UID: \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.293942 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqcmn\" (UniqueName: \"kubernetes.io/projected/4e8caefd-2fb4-46aa-8d80-83376f1660bf-kube-api-access-sqcmn\") pod \"dnsmasq-dns-5bf47b49b7-tmtb8\" (UID: \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\") " pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.367506 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2ca95a0e-df45-403b-b278-5701175ac8e1-ovn-rundir\") pod \"ovn-controller-metrics-xbfj5\" (UID: \"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.367698 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca95a0e-df45-403b-b278-5701175ac8e1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xbfj5\" (UID: \"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.367866 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca95a0e-df45-403b-b278-5701175ac8e1-combined-ca-bundle\") pod \"ovn-controller-metrics-xbfj5\" (UID: 
\"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.367955 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2ca95a0e-df45-403b-b278-5701175ac8e1-ovs-rundir\") pod \"ovn-controller-metrics-xbfj5\" (UID: \"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.368051 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca95a0e-df45-403b-b278-5701175ac8e1-config\") pod \"ovn-controller-metrics-xbfj5\" (UID: \"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.368106 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws5xq\" (UniqueName: \"kubernetes.io/projected/2ca95a0e-df45-403b-b278-5701175ac8e1-kube-api-access-ws5xq\") pod \"ovn-controller-metrics-xbfj5\" (UID: \"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.368411 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2ca95a0e-df45-403b-b278-5701175ac8e1-ovs-rundir\") pod \"ovn-controller-metrics-xbfj5\" (UID: \"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.368832 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2ca95a0e-df45-403b-b278-5701175ac8e1-ovn-rundir\") pod \"ovn-controller-metrics-xbfj5\" (UID: \"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.369199 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca95a0e-df45-403b-b278-5701175ac8e1-config\") pod \"ovn-controller-metrics-xbfj5\" (UID: \"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.373169 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca95a0e-df45-403b-b278-5701175ac8e1-combined-ca-bundle\") pod \"ovn-controller-metrics-xbfj5\" (UID: \"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.373209 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ca95a0e-df45-403b-b278-5701175ac8e1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xbfj5\" (UID: \"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.386674 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws5xq\" (UniqueName: \"kubernetes.io/projected/2ca95a0e-df45-403b-b278-5701175ac8e1-kube-api-access-ws5xq\") pod \"ovn-controller-metrics-xbfj5\" (UID: \"2ca95a0e-df45-403b-b278-5701175ac8e1\") " pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc 
kubenswrapper[4740]: I0105 14:07:16.462977 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.543315 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xbfj5" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.599677 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7fx9n"] Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.639168 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-8wm86"] Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.641092 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.642860 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.677052 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-dns-svc\") pod \"dnsmasq-dns-8554648995-8wm86\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.677206 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8wm86\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.677595 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-config\") pod \"dnsmasq-dns-8554648995-8wm86\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.677974 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8wm86\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.678308 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llbqr\" (UniqueName: \"kubernetes.io/projected/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-kube-api-access-llbqr\") pod \"dnsmasq-dns-8554648995-8wm86\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.697370 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8wm86"] Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.790357 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-config\") pod \"dnsmasq-dns-8554648995-8wm86\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " 
pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.790419 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8wm86\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.790495 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llbqr\" (UniqueName: \"kubernetes.io/projected/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-kube-api-access-llbqr\") pod \"dnsmasq-dns-8554648995-8wm86\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.790572 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-dns-svc\") pod \"dnsmasq-dns-8554648995-8wm86\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.790596 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8wm86\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.791385 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-config\") pod \"dnsmasq-dns-8554648995-8wm86\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.791457 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8wm86\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.791937 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8wm86\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.792002 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-dns-svc\") pod \"dnsmasq-dns-8554648995-8wm86\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.792248 4740 generic.go:334] "Generic (PLEG): container finished" podID="a8a0f7e4-fb87-44e1-aa94-af4ac6db619e" containerID="7b1267804c5ccdc596aae17ec026c3ad3e951e9204f02aed410068a96342f251" exitCode=0 Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.792296 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" 
event={"ID":"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e","Type":"ContainerDied","Data":"7b1267804c5ccdc596aae17ec026c3ad3e951e9204f02aed410068a96342f251"} Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.792565 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.826804 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llbqr\" (UniqueName: \"kubernetes.io/projected/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-kube-api-access-llbqr\") pod \"dnsmasq-dns-8554648995-8wm86\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:16 crc kubenswrapper[4740]: I0105 14:07:16.872000 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.000125 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.040041 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-tmtb8"] Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.118716 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.124186 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.127414 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.128886 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-lft4w" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.129226 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.129816 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.134983 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.313552 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf526b0a-998c-4943-bfba-04352421ed58-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.313817 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf526b0a-998c-4943-bfba-04352421ed58-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.313854 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf526b0a-998c-4943-bfba-04352421ed58-config\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.313903 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf526b0a-998c-4943-bfba-04352421ed58-scripts\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.313921 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf526b0a-998c-4943-bfba-04352421ed58-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.313955 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf526b0a-998c-4943-bfba-04352421ed58-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.313987 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msh59\" (UniqueName: \"kubernetes.io/projected/bf526b0a-998c-4943-bfba-04352421ed58-kube-api-access-msh59\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.364455 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xbfj5"] Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.416680 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msh59\" (UniqueName: \"kubernetes.io/projected/bf526b0a-998c-4943-bfba-04352421ed58-kube-api-access-msh59\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.416896 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf526b0a-998c-4943-bfba-04352421ed58-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.417002 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf526b0a-998c-4943-bfba-04352421ed58-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.417112 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf526b0a-998c-4943-bfba-04352421ed58-config\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.417210 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf526b0a-998c-4943-bfba-04352421ed58-scripts\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.417275 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf526b0a-998c-4943-bfba-04352421ed58-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.417425 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf526b0a-998c-4943-bfba-04352421ed58-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.417860 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf526b0a-998c-4943-bfba-04352421ed58-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.418139 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf526b0a-998c-4943-bfba-04352421ed58-config\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.418319 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf526b0a-998c-4943-bfba-04352421ed58-scripts\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.421674 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf526b0a-998c-4943-bfba-04352421ed58-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.421695 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf526b0a-998c-4943-bfba-04352421ed58-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.425299 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf526b0a-998c-4943-bfba-04352421ed58-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.437178 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msh59\" (UniqueName: \"kubernetes.io/projected/bf526b0a-998c-4943-bfba-04352421ed58-kube-api-access-msh59\") pod \"ovn-northd-0\" (UID: \"bf526b0a-998c-4943-bfba-04352421ed58\") " pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.467953 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7fx9n" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.475624 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.493279 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.620182 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-config\") pod \"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e\" (UID: \"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e\") " Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.620259 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-dns-svc\") pod \"4b270e1a-8c5f-4c3f-b070-afa15ee2cda9\" (UID: \"4b270e1a-8c5f-4c3f-b070-afa15ee2cda9\") " Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.620290 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5gf2\" (UniqueName: \"kubernetes.io/projected/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-kube-api-access-t5gf2\") pod \"4b270e1a-8c5f-4c3f-b070-afa15ee2cda9\" (UID: \"4b270e1a-8c5f-4c3f-b070-afa15ee2cda9\") " Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.620432 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-dns-svc\") pod \"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e\" (UID: \"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e\") " Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.620524 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-config\") pod \"4b270e1a-8c5f-4c3f-b070-afa15ee2cda9\" (UID: \"4b270e1a-8c5f-4c3f-b070-afa15ee2cda9\") " Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.620556 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkdwx\" (UniqueName: \"kubernetes.io/projected/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-kube-api-access-mkdwx\") pod \"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e\" (UID: \"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e\") " Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.620923 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4b270e1a-8c5f-4c3f-b070-afa15ee2cda9" (UID: "4b270e1a-8c5f-4c3f-b070-afa15ee2cda9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.621474 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-config" (OuterVolumeSpecName: "config") pod "4b270e1a-8c5f-4c3f-b070-afa15ee2cda9" (UID: "4b270e1a-8c5f-4c3f-b070-afa15ee2cda9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.621618 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.625552 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-kube-api-access-t5gf2" (OuterVolumeSpecName: "kube-api-access-t5gf2") pod "4b270e1a-8c5f-4c3f-b070-afa15ee2cda9" (UID: "4b270e1a-8c5f-4c3f-b070-afa15ee2cda9"). InnerVolumeSpecName "kube-api-access-t5gf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.626650 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-kube-api-access-mkdwx" (OuterVolumeSpecName: "kube-api-access-mkdwx") pod "a8a0f7e4-fb87-44e1-aa94-af4ac6db619e" (UID: "a8a0f7e4-fb87-44e1-aa94-af4ac6db619e"). InnerVolumeSpecName "kube-api-access-mkdwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.697353 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8wm86"] Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.726367 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5gf2\" (UniqueName: \"kubernetes.io/projected/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-kube-api-access-t5gf2\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.726449 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.726472 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkdwx\" (UniqueName: \"kubernetes.io/projected/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-kube-api-access-mkdwx\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.827668 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"54c80dba-b90f-4288-a366-4ff77f76db22","Type":"ContainerStarted","Data":"1c047226985f396d19c238260e8d4fcd5bd4861cf84ef028ed887a1b0fa068b9"} Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.829298 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7fx9n" event={"ID":"4b270e1a-8c5f-4c3f-b070-afa15ee2cda9","Type":"ContainerDied","Data":"ca78bb24854395cf67f9a0557895ccf05c178357d90ee47338738e88d611a477"} Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.829391 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7fx9n" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.831508 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"261c319a-37da-4987-a774-ecc24fa6b083","Type":"ContainerStarted","Data":"5e71d5cf5b7f04e611003ded79bb677d979a68bb5d8c13a30a7adf835aa9fb4f"} Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.838635 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" event={"ID":"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e","Type":"ContainerDied","Data":"cb35f0518089df11d7e4b07a9c654c785ddf77a73241db4b87e7507e8c22a37e"} Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.838671 4740 scope.go:117] "RemoveContainer" containerID="7b1267804c5ccdc596aae17ec026c3ad3e951e9204f02aed410068a96342f251" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.838785 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-bs4mf" Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.842566 4740 generic.go:334] "Generic (PLEG): container finished" podID="4e8caefd-2fb4-46aa-8d80-83376f1660bf" containerID="1cec8a7b9cc219185952d9b5c8dc01484d221e023628c8bc111888303dae7cbf" exitCode=0 Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.842625 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" event={"ID":"4e8caefd-2fb4-46aa-8d80-83376f1660bf","Type":"ContainerDied","Data":"1cec8a7b9cc219185952d9b5c8dc01484d221e023628c8bc111888303dae7cbf"} Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.842650 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" event={"ID":"4e8caefd-2fb4-46aa-8d80-83376f1660bf","Type":"ContainerStarted","Data":"670c86e8bd30170aedbd6f30555d0fd579661af3c790ed24f3fde6e75705c2b9"} Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.849157 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xbfj5" event={"ID":"2ca95a0e-df45-403b-b278-5701175ac8e1","Type":"ContainerStarted","Data":"4ea951fd01a313ed653fd9477982ca850edc2ee8d008831d93405de205cb2df0"} Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.945215 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7fx9n"] Jan 05 14:07:17 crc kubenswrapper[4740]: I0105 14:07:17.972690 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7fx9n"] Jan 05 14:07:18 crc kubenswrapper[4740]: I0105 14:07:18.087375 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 05 14:07:18 crc kubenswrapper[4740]: W0105 14:07:18.143569 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf526b0a_998c_4943_bfba_04352421ed58.slice/crio-76f9320683540ace5e3ebf36fb2fbb92bb6059f0d4268ffe1f192effad3aef16 WatchSource:0}: Error finding container 76f9320683540ace5e3ebf36fb2fbb92bb6059f0d4268ffe1f192effad3aef16: Status 404 returned error can't find the container with id 76f9320683540ace5e3ebf36fb2fbb92bb6059f0d4268ffe1f192effad3aef16 Jan 05 14:07:18 crc kubenswrapper[4740]: I0105 14:07:18.363863 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-config" (OuterVolumeSpecName: "config") pod 
"a8a0f7e4-fb87-44e1-aa94-af4ac6db619e" (UID: "a8a0f7e4-fb87-44e1-aa94-af4ac6db619e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:18 crc kubenswrapper[4740]: I0105 14:07:18.374601 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a8a0f7e4-fb87-44e1-aa94-af4ac6db619e" (UID: "a8a0f7e4-fb87-44e1-aa94-af4ac6db619e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:18 crc kubenswrapper[4740]: I0105 14:07:18.445892 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:18 crc kubenswrapper[4740]: I0105 14:07:18.445922 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:18 crc kubenswrapper[4740]: I0105 14:07:18.545237 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bs4mf"] Jan 05 14:07:18 crc kubenswrapper[4740]: I0105 14:07:18.558690 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bs4mf"] Jan 05 14:07:18 crc kubenswrapper[4740]: I0105 14:07:18.862357 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"4396968c-d77b-434d-888f-3ab578514bbe","Type":"ContainerStarted","Data":"993c05e80bcb4da74365c39514c46be69f849324868d0b66d705303f726b6ac1"} Jan 05 14:07:18 crc kubenswrapper[4740]: I0105 14:07:18.873056 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" event={"ID":"4e8caefd-2fb4-46aa-8d80-83376f1660bf","Type":"ContainerStarted","Data":"440d81a7e674e2fb34a0c4fc0c48b4f75befeeaa3e0a62c261530046639992fd"} Jan 05 14:07:18 crc kubenswrapper[4740]: I0105 14:07:18.878262 4740 generic.go:334] "Generic (PLEG): container finished" podID="dbb5263b-e98b-48a4-825e-ffb99738059f" containerID="14b51bc5cf0f6a9466bff6b637aaa821d11333c346faa4e07111305c47be26aa" exitCode=0 Jan 05 14:07:18 crc kubenswrapper[4740]: I0105 14:07:18.878351 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbb5263b-e98b-48a4-825e-ffb99738059f","Type":"ContainerDied","Data":"14b51bc5cf0f6a9466bff6b637aaa821d11333c346faa4e07111305c47be26aa"} Jan 05 14:07:18 crc kubenswrapper[4740]: I0105 14:07:18.884221 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8wm86" event={"ID":"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7","Type":"ContainerStarted","Data":"b15349ac089e2a2a618e19ca3d7ef31eef8331d501a3fc3464404130fe69133e"} Jan 05 14:07:18 crc kubenswrapper[4740]: I0105 14:07:18.884316 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8wm86" event={"ID":"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7","Type":"ContainerStarted","Data":"d6deec3f4e846994889d57c13da85c5977382f0de65d6a197c380bc5bf4c5abe"} Jan 05 14:07:18 crc kubenswrapper[4740]: I0105 14:07:18.891171 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bf526b0a-998c-4943-bfba-04352421ed58","Type":"ContainerStarted","Data":"76f9320683540ace5e3ebf36fb2fbb92bb6059f0d4268ffe1f192effad3aef16"} Jan 05 14:07:18 crc 
kubenswrapper[4740]: I0105 14:07:18.896428 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xbfj5" event={"ID":"2ca95a0e-df45-403b-b278-5701175ac8e1","Type":"ContainerStarted","Data":"dd4d019b0512bbfce0243f04f3c4b3d366c4269cd9a9488bb1e5b0d6545a8464"} Jan 05 14:07:18 crc kubenswrapper[4740]: I0105 14:07:18.987433 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b270e1a-8c5f-4c3f-b070-afa15ee2cda9" path="/var/lib/kubelet/pods/4b270e1a-8c5f-4c3f-b070-afa15ee2cda9/volumes" Jan 05 14:07:18 crc kubenswrapper[4740]: I0105 14:07:18.988603 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8a0f7e4-fb87-44e1-aa94-af4ac6db619e" path="/var/lib/kubelet/pods/a8a0f7e4-fb87-44e1-aa94-af4ac6db619e/volumes" Jan 05 14:07:19 crc kubenswrapper[4740]: I0105 14:07:19.003132 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-xbfj5" podStartSLOduration=3.003107324 podStartE2EDuration="3.003107324s" podCreationTimestamp="2026-01-05 14:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:07:18.948972107 +0000 UTC m=+1088.255880686" watchObservedRunningTime="2026-01-05 14:07:19.003107324 +0000 UTC m=+1088.310015913" Jan 05 14:07:19 crc kubenswrapper[4740]: I0105 14:07:19.738403 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 05 14:07:19 crc kubenswrapper[4740]: I0105 14:07:19.904383 4740 generic.go:334] "Generic (PLEG): container finished" podID="0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7" containerID="b15349ac089e2a2a618e19ca3d7ef31eef8331d501a3fc3464404130fe69133e" exitCode=0 Jan 05 14:07:19 crc kubenswrapper[4740]: I0105 14:07:19.904446 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8wm86" event={"ID":"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7","Type":"ContainerDied","Data":"b15349ac089e2a2a618e19ca3d7ef31eef8331d501a3fc3464404130fe69133e"} Jan 05 14:07:19 crc kubenswrapper[4740]: I0105 14:07:19.907645 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbb5263b-e98b-48a4-825e-ffb99738059f","Type":"ContainerStarted","Data":"835db857eed7aafbfc05fddb2e1acc2bfca45dd9afefccb55f578e40d71c0888"} Jan 05 14:07:19 crc kubenswrapper[4740]: I0105 14:07:19.954429 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" podStartSLOduration=3.9544132850000002 podStartE2EDuration="3.954413285s" podCreationTimestamp="2026-01-05 14:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:07:19.945324892 +0000 UTC m=+1089.252233471" watchObservedRunningTime="2026-01-05 14:07:19.954413285 +0000 UTC m=+1089.261321864" Jan 05 14:07:19 crc kubenswrapper[4740]: I0105 14:07:19.976048 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.118763667 podStartE2EDuration="43.976018303s" podCreationTimestamp="2026-01-05 14:06:36 +0000 UTC" firstStartedPulling="2026-01-05 14:06:38.52717637 +0000 UTC m=+1047.834084949" lastFinishedPulling="2026-01-05 14:07:14.384431006 +0000 UTC m=+1083.691339585" observedRunningTime="2026-01-05 14:07:19.96768144 +0000 UTC m=+1089.274590019" watchObservedRunningTime="2026-01-05 
14:07:19.976018303 +0000 UTC m=+1089.282926872" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.464169 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.545729 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-tmtb8"] Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.577985 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nbmdk"] Jan 05 14:07:21 crc kubenswrapper[4740]: E0105 14:07:21.578407 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8a0f7e4-fb87-44e1-aa94-af4ac6db619e" containerName="init" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.578424 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8a0f7e4-fb87-44e1-aa94-af4ac6db619e" containerName="init" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.578627 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8a0f7e4-fb87-44e1-aa94-af4ac6db619e" containerName="init" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.579680 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.734849 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-nbmdk\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.734894 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnwtq\" (UniqueName: \"kubernetes.io/projected/6fe29f6d-955f-4f3f-b62f-634812236d3e-kube-api-access-tnwtq\") pod \"dnsmasq-dns-b8fbc5445-nbmdk\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.735218 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-nbmdk\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.736172 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-config\") pod \"dnsmasq-dns-b8fbc5445-nbmdk\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.736407 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-nbmdk\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.838696 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-nbmdk\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.838887 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-config\") pod \"dnsmasq-dns-b8fbc5445-nbmdk\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.838988 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-nbmdk\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.839122 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-nbmdk\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.839183 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnwtq\" (UniqueName: \"kubernetes.io/projected/6fe29f6d-955f-4f3f-b62f-634812236d3e-kube-api-access-tnwtq\") pod \"dnsmasq-dns-b8fbc5445-nbmdk\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.840268 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-config\") pod \"dnsmasq-dns-b8fbc5445-nbmdk\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.840314 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-nbmdk\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.840270 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-nbmdk\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.840342 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-nbmdk\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.863680 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnwtq\" (UniqueName: \"kubernetes.io/projected/6fe29f6d-955f-4f3f-b62f-634812236d3e-kube-api-access-tnwtq\") pod \"dnsmasq-dns-b8fbc5445-nbmdk\" (UID: 
\"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:21 crc kubenswrapper[4740]: I0105 14:07:21.896527 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:22 crc kubenswrapper[4740]: I0105 14:07:22.013989 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" podUID="4e8caefd-2fb4-46aa-8d80-83376f1660bf" containerName="dnsmasq-dns" containerID="cri-o://440d81a7e674e2fb34a0c4fc0c48b4f75befeeaa3e0a62c261530046639992fd" gracePeriod=10 Jan 05 14:07:22 crc kubenswrapper[4740]: I0105 14:07:22.014897 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 05 14:07:22 crc kubenswrapper[4740]: I0105 14:07:22.017438 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nbmdk"] Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.136173 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.154326 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.155933 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.156629 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.156875 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.157326 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.157951 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-4vzz4" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.267048 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-cache\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.267154 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgv6s\" (UniqueName: \"kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-kube-api-access-lgv6s\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.267220 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d3b8c498-d76e-4b25-b480-8a1a0a4d62a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3b8c498-d76e-4b25-b480-8a1a0a4d62a0\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.267493 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift\") pod 
\"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.267635 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-lock\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.369481 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-lock\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.369541 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-cache\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.369594 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgv6s\" (UniqueName: \"kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-kube-api-access-lgv6s\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.369631 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d3b8c498-d76e-4b25-b480-8a1a0a4d62a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3b8c498-d76e-4b25-b480-8a1a0a4d62a0\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.369680 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:23 crc kubenswrapper[4740]: E0105 14:07:23.369886 4740 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 05 14:07:23 crc kubenswrapper[4740]: E0105 14:07:23.369901 4740 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 05 14:07:23 crc kubenswrapper[4740]: E0105 14:07:23.369954 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift podName:e1b97919-20e0-4eb9-a60b-0f52a4d7c73b nodeName:}" failed. No retries permitted until 2026-01-05 14:07:23.86993813 +0000 UTC m=+1093.176846709 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift") pod "swift-storage-0" (UID: "e1b97919-20e0-4eb9-a60b-0f52a4d7c73b") : configmap "swift-ring-files" not found Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.370031 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-lock\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.370158 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-cache\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.371183 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-g42rm"] Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.372380 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.372860 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.372896 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d3b8c498-d76e-4b25-b480-8a1a0a4d62a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3b8c498-d76e-4b25-b480-8a1a0a4d62a0\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cd1f393f831151d1fa9e1c4e591f2d7f95e2b4876cde83e252c1aa6c0b367e72/globalmount\"" pod="openstack/swift-storage-0" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.375052 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.375239 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.391698 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.396044 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgv6s\" (UniqueName: \"kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-kube-api-access-lgv6s\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.420551 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-g42rm"] Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.472292 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e2004852-6aa3-4939-b9ee-143fec00054b-ring-data-devices\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.472358 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-dispersionconf\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.472355 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d3b8c498-d76e-4b25-b480-8a1a0a4d62a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3b8c498-d76e-4b25-b480-8a1a0a4d62a0\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.472529 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-combined-ca-bundle\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.472641 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e2004852-6aa3-4939-b9ee-143fec00054b-etc-swift\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.472670 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxm7k\" (UniqueName: \"kubernetes.io/projected/e2004852-6aa3-4939-b9ee-143fec00054b-kube-api-access-dxm7k\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.472702 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-swiftconf\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.472773 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2004852-6aa3-4939-b9ee-143fec00054b-scripts\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.481763 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-g42rm"] Jan 05 14:07:23 crc kubenswrapper[4740]: E0105 14:07:23.494362 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-dxm7k ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-g42rm" podUID="e2004852-6aa3-4939-b9ee-143fec00054b" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.495873 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-qvtqr"] Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.497886 4740 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.508049 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qvtqr"] Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.574168 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a24dd80e-d5b8-4449-96f3-d2682acd78c8-ring-data-devices\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.574827 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e2004852-6aa3-4939-b9ee-143fec00054b-etc-swift\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.575453 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxm7k\" (UniqueName: \"kubernetes.io/projected/e2004852-6aa3-4939-b9ee-143fec00054b-kube-api-access-dxm7k\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.575573 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-combined-ca-bundle\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.575671 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-swiftconf\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.575834 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-swiftconf\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.575931 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2004852-6aa3-4939-b9ee-143fec00054b-scripts\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.576014 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-dispersionconf\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.576128 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/a24dd80e-d5b8-4449-96f3-d2682acd78c8-scripts\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.576231 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a24dd80e-d5b8-4449-96f3-d2682acd78c8-etc-swift\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.575401 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e2004852-6aa3-4939-b9ee-143fec00054b-etc-swift\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.576505 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e2004852-6aa3-4939-b9ee-143fec00054b-ring-data-devices\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.576618 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dhtc\" (UniqueName: \"kubernetes.io/projected/a24dd80e-d5b8-4449-96f3-d2682acd78c8-kube-api-access-8dhtc\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.576705 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-dispersionconf\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.576839 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-combined-ca-bundle\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.576938 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2004852-6aa3-4939-b9ee-143fec00054b-scripts\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.577327 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e2004852-6aa3-4939-b9ee-143fec00054b-ring-data-devices\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.582125 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-swiftconf\") pod 
\"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.583013 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-combined-ca-bundle\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.583213 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-dispersionconf\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.595994 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxm7k\" (UniqueName: \"kubernetes.io/projected/e2004852-6aa3-4939-b9ee-143fec00054b-kube-api-access-dxm7k\") pod \"swift-ring-rebalance-g42rm\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.678311 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-swiftconf\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.678448 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-dispersionconf\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.678526 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a24dd80e-d5b8-4449-96f3-d2682acd78c8-scripts\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.678949 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a24dd80e-d5b8-4449-96f3-d2682acd78c8-etc-swift\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.679095 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dhtc\" (UniqueName: \"kubernetes.io/projected/a24dd80e-d5b8-4449-96f3-d2682acd78c8-kube-api-access-8dhtc\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.679224 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a24dd80e-d5b8-4449-96f3-d2682acd78c8-ring-data-devices\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " 
pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.679269 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a24dd80e-d5b8-4449-96f3-d2682acd78c8-scripts\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.679346 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a24dd80e-d5b8-4449-96f3-d2682acd78c8-etc-swift\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.679599 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-combined-ca-bundle\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.679773 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a24dd80e-d5b8-4449-96f3-d2682acd78c8-ring-data-devices\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.682279 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-dispersionconf\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.682523 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-swiftconf\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.691416 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-combined-ca-bundle\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.696846 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dhtc\" (UniqueName: \"kubernetes.io/projected/a24dd80e-d5b8-4449-96f3-d2682acd78c8-kube-api-access-8dhtc\") pod \"swift-ring-rebalance-qvtqr\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.815295 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:23 crc kubenswrapper[4740]: I0105 14:07:23.884631 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:23 crc kubenswrapper[4740]: E0105 14:07:23.884883 4740 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 05 14:07:23 crc kubenswrapper[4740]: E0105 14:07:23.884931 4740 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 05 14:07:23 crc kubenswrapper[4740]: E0105 14:07:23.885006 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift podName:e1b97919-20e0-4eb9-a60b-0f52a4d7c73b nodeName:}" failed. No retries permitted until 2026-01-05 14:07:24.884979959 +0000 UTC m=+1094.191888578 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift") pod "swift-storage-0" (UID: "e1b97919-20e0-4eb9-a60b-0f52a4d7c73b") : configmap "swift-ring-files" not found Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.039681 4740 generic.go:334] "Generic (PLEG): container finished" podID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerID="0b97cb304c7443a8f612ae0939ddd0a643468732302ad8bbdc9539057e6d4f8a" exitCode=0 Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.039855 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cfa6a7e6-018b-43df-8084-48e1aa10e2ca","Type":"ContainerDied","Data":"0b97cb304c7443a8f612ae0939ddd0a643468732302ad8bbdc9539057e6d4f8a"} Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.043314 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"eeb4c870-b0d8-4d92-82c1-aedb35200c4b","Type":"ContainerStarted","Data":"b3dbdf8362bc625ca04348e9bb74d68d3f768dd9528dbaeca36d3f3a321e26f6"} Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.045734 4740 generic.go:334] "Generic (PLEG): container finished" podID="4e8caefd-2fb4-46aa-8d80-83376f1660bf" containerID="440d81a7e674e2fb34a0c4fc0c48b4f75befeeaa3e0a62c261530046639992fd" exitCode=0 Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.046049 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.045771 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" event={"ID":"4e8caefd-2fb4-46aa-8d80-83376f1660bf","Type":"ContainerDied","Data":"440d81a7e674e2fb34a0c4fc0c48b4f75befeeaa3e0a62c261530046639992fd"} Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.070404 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.189503 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e2004852-6aa3-4939-b9ee-143fec00054b-ring-data-devices\") pod \"e2004852-6aa3-4939-b9ee-143fec00054b\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.189581 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e2004852-6aa3-4939-b9ee-143fec00054b-etc-swift\") pod \"e2004852-6aa3-4939-b9ee-143fec00054b\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.189769 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxm7k\" (UniqueName: \"kubernetes.io/projected/e2004852-6aa3-4939-b9ee-143fec00054b-kube-api-access-dxm7k\") pod \"e2004852-6aa3-4939-b9ee-143fec00054b\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.189820 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-combined-ca-bundle\") pod \"e2004852-6aa3-4939-b9ee-143fec00054b\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.189963 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-dispersionconf\") pod \"e2004852-6aa3-4939-b9ee-143fec00054b\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.189968 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2004852-6aa3-4939-b9ee-143fec00054b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e2004852-6aa3-4939-b9ee-143fec00054b" (UID: "e2004852-6aa3-4939-b9ee-143fec00054b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.189995 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2004852-6aa3-4939-b9ee-143fec00054b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e2004852-6aa3-4939-b9ee-143fec00054b" (UID: "e2004852-6aa3-4939-b9ee-143fec00054b"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.190035 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-swiftconf\") pod \"e2004852-6aa3-4939-b9ee-143fec00054b\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.190111 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2004852-6aa3-4939-b9ee-143fec00054b-scripts\") pod \"e2004852-6aa3-4939-b9ee-143fec00054b\" (UID: \"e2004852-6aa3-4939-b9ee-143fec00054b\") " Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.190688 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2004852-6aa3-4939-b9ee-143fec00054b-scripts" (OuterVolumeSpecName: "scripts") pod "e2004852-6aa3-4939-b9ee-143fec00054b" (UID: "e2004852-6aa3-4939-b9ee-143fec00054b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.191635 4740 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e2004852-6aa3-4939-b9ee-143fec00054b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.191687 4740 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e2004852-6aa3-4939-b9ee-143fec00054b-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.191699 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2004852-6aa3-4939-b9ee-143fec00054b-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.193725 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e2004852-6aa3-4939-b9ee-143fec00054b" (UID: "e2004852-6aa3-4939-b9ee-143fec00054b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.194262 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2004852-6aa3-4939-b9ee-143fec00054b-kube-api-access-dxm7k" (OuterVolumeSpecName: "kube-api-access-dxm7k") pod "e2004852-6aa3-4939-b9ee-143fec00054b" (UID: "e2004852-6aa3-4939-b9ee-143fec00054b"). InnerVolumeSpecName "kube-api-access-dxm7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.194266 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2004852-6aa3-4939-b9ee-143fec00054b" (UID: "e2004852-6aa3-4939-b9ee-143fec00054b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.195666 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e2004852-6aa3-4939-b9ee-143fec00054b" (UID: "e2004852-6aa3-4939-b9ee-143fec00054b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.293484 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxm7k\" (UniqueName: \"kubernetes.io/projected/e2004852-6aa3-4939-b9ee-143fec00054b-kube-api-access-dxm7k\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.293533 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.293551 4740 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.293569 4740 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e2004852-6aa3-4939-b9ee-143fec00054b-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:24 crc kubenswrapper[4740]: I0105 14:07:24.905656 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:24 crc kubenswrapper[4740]: E0105 14:07:24.905986 4740 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 05 14:07:24 crc kubenswrapper[4740]: E0105 14:07:24.906010 4740 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 05 14:07:24 crc kubenswrapper[4740]: E0105 14:07:24.906078 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift podName:e1b97919-20e0-4eb9-a60b-0f52a4d7c73b nodeName:}" failed. No retries permitted until 2026-01-05 14:07:26.906042725 +0000 UTC m=+1096.212951324 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift") pod "swift-storage-0" (UID: "e1b97919-20e0-4eb9-a60b-0f52a4d7c73b") : configmap "swift-ring-files" not found Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.086269 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-g42rm" Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.146316 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-g42rm"] Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.154481 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-g42rm"] Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.481225 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nbmdk"] Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.502872 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qvtqr"] Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.735662 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.834465 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-ovsdbserver-nb\") pod \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\" (UID: \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\") " Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.834550 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-dns-svc\") pod \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\" (UID: \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\") " Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.834686 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-config\") pod \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\" (UID: \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\") " Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.834857 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqcmn\" (UniqueName: \"kubernetes.io/projected/4e8caefd-2fb4-46aa-8d80-83376f1660bf-kube-api-access-sqcmn\") pod \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\" (UID: \"4e8caefd-2fb4-46aa-8d80-83376f1660bf\") " Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.840428 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e8caefd-2fb4-46aa-8d80-83376f1660bf-kube-api-access-sqcmn" (OuterVolumeSpecName: "kube-api-access-sqcmn") pod "4e8caefd-2fb4-46aa-8d80-83376f1660bf" (UID: "4e8caefd-2fb4-46aa-8d80-83376f1660bf"). InnerVolumeSpecName "kube-api-access-sqcmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.886727 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e8caefd-2fb4-46aa-8d80-83376f1660bf" (UID: "4e8caefd-2fb4-46aa-8d80-83376f1660bf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.901373 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-config" (OuterVolumeSpecName: "config") pod "4e8caefd-2fb4-46aa-8d80-83376f1660bf" (UID: "4e8caefd-2fb4-46aa-8d80-83376f1660bf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.904671 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e8caefd-2fb4-46aa-8d80-83376f1660bf" (UID: "4e8caefd-2fb4-46aa-8d80-83376f1660bf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.938467 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.938521 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.938539 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqcmn\" (UniqueName: \"kubernetes.io/projected/4e8caefd-2fb4-46aa-8d80-83376f1660bf-kube-api-access-sqcmn\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:25 crc kubenswrapper[4740]: I0105 14:07:25.938558 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e8caefd-2fb4-46aa-8d80-83376f1660bf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.104508 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" event={"ID":"4e8caefd-2fb4-46aa-8d80-83376f1660bf","Type":"ContainerDied","Data":"670c86e8bd30170aedbd6f30555d0fd579661af3c790ed24f3fde6e75705c2b9"} Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.104678 4740 scope.go:117] "RemoveContainer" containerID="440d81a7e674e2fb34a0c4fc0c48b4f75befeeaa3e0a62c261530046639992fd" Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.104593 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-tmtb8" Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.106553 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qvtqr" event={"ID":"a24dd80e-d5b8-4449-96f3-d2682acd78c8","Type":"ContainerStarted","Data":"42efa0f2a035838b15f34b5fa614bcaaf0eb6b5de6c340c3bf3c0e2ae23fcb5e"} Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.109554 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299","Type":"ContainerStarted","Data":"ec151a9ec49a8c4a5f947daa6a8a86ed246e271c624f6ccbece01f1324acdbef"} Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.114514 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8wm86" event={"ID":"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7","Type":"ContainerStarted","Data":"d5b31cdbad2ff38f36564588d2cd40bd23cff3cf5d6c256d1984b2468de9fc90"} Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.114718 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.117582 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bf526b0a-998c-4943-bfba-04352421ed58","Type":"ContainerStarted","Data":"c49ae39da56d7108d17a78edc45b511df96cc67d94837688199a076f018a9178"} Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.117605 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bf526b0a-998c-4943-bfba-04352421ed58","Type":"ContainerStarted","Data":"51aab52be8560eabd07cc70785842655eb3d8925eea6f91beea05489f09210b4"} Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.117740 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.119315 4740 generic.go:334] "Generic (PLEG): container finished" podID="6fe29f6d-955f-4f3f-b62f-634812236d3e" containerID="81339fa256d495423f7e573ef3c248ecb52a7346c675a5aab6bc5ce2f5eabc65" exitCode=0 Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.119364 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" event={"ID":"6fe29f6d-955f-4f3f-b62f-634812236d3e","Type":"ContainerDied","Data":"81339fa256d495423f7e573ef3c248ecb52a7346c675a5aab6bc5ce2f5eabc65"} Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.119410 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" event={"ID":"6fe29f6d-955f-4f3f-b62f-634812236d3e","Type":"ContainerStarted","Data":"3b6ac88baff9332edc930d2589593a0906ddb1e0ff08811a2172e72d15216f6c"} Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.148134 4740 scope.go:117] "RemoveContainer" containerID="1cec8a7b9cc219185952d9b5c8dc01484d221e023628c8bc111888303dae7cbf" Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.217890 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-8wm86" podStartSLOduration=10.217872533 podStartE2EDuration="10.217872533s" podCreationTimestamp="2026-01-05 14:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:07:26.209790597 +0000 UTC m=+1095.516699176" watchObservedRunningTime="2026-01-05 14:07:26.217872533 +0000 UTC 
m=+1095.524781112" Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.236230 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-tmtb8"] Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.247267 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-tmtb8"] Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.257156 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.7678787470000001 podStartE2EDuration="9.257134563s" podCreationTimestamp="2026-01-05 14:07:17 +0000 UTC" firstStartedPulling="2026-01-05 14:07:18.146045763 +0000 UTC m=+1087.452954342" lastFinishedPulling="2026-01-05 14:07:25.635301579 +0000 UTC m=+1094.942210158" observedRunningTime="2026-01-05 14:07:26.243925359 +0000 UTC m=+1095.550833968" watchObservedRunningTime="2026-01-05 14:07:26.257134563 +0000 UTC m=+1095.564043162" Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.981362 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e8caefd-2fb4-46aa-8d80-83376f1660bf" path="/var/lib/kubelet/pods/4e8caefd-2fb4-46aa-8d80-83376f1660bf/volumes" Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.982484 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2004852-6aa3-4939-b9ee-143fec00054b" path="/var/lib/kubelet/pods/e2004852-6aa3-4939-b9ee-143fec00054b/volumes" Jan 05 14:07:26 crc kubenswrapper[4740]: I0105 14:07:26.993085 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:26 crc kubenswrapper[4740]: E0105 14:07:26.993488 4740 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 05 14:07:26 crc kubenswrapper[4740]: E0105 14:07:26.993514 4740 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 05 14:07:26 crc kubenswrapper[4740]: E0105 14:07:26.993574 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift podName:e1b97919-20e0-4eb9-a60b-0f52a4d7c73b nodeName:}" failed. No retries permitted until 2026-01-05 14:07:30.993555559 +0000 UTC m=+1100.300464148 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift") pod "swift-storage-0" (UID: "e1b97919-20e0-4eb9-a60b-0f52a4d7c73b") : configmap "swift-ring-files" not found Jan 05 14:07:27 crc kubenswrapper[4740]: I0105 14:07:27.130167 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" event={"ID":"6fe29f6d-955f-4f3f-b62f-634812236d3e","Type":"ContainerStarted","Data":"c87abca0d72793970bea77bdca8ffb24f0f03ec87bb5bad620c09fc498aca5fd"} Jan 05 14:07:27 crc kubenswrapper[4740]: I0105 14:07:27.130382 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:27 crc kubenswrapper[4740]: I0105 14:07:27.147081 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" podStartSLOduration=6.147047973 podStartE2EDuration="6.147047973s" podCreationTimestamp="2026-01-05 14:07:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:07:27.143080396 +0000 UTC m=+1096.449988975" watchObservedRunningTime="2026-01-05 14:07:27.147047973 +0000 UTC m=+1096.453956552" Jan 05 14:07:27 crc kubenswrapper[4740]: I0105 14:07:27.835600 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 05 14:07:27 crc kubenswrapper[4740]: I0105 14:07:27.835903 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 05 14:07:28 crc kubenswrapper[4740]: I0105 14:07:28.254516 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 05 14:07:28 crc kubenswrapper[4740]: I0105 14:07:28.327939 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="dbb5263b-e98b-48a4-825e-ffb99738059f" containerName="galera" probeResult="failure" output=< Jan 05 14:07:28 crc kubenswrapper[4740]: wsrep_local_state_comment (Joined) differs from Synced Jan 05 14:07:28 crc kubenswrapper[4740]: > Jan 05 14:07:30 crc kubenswrapper[4740]: I0105 14:07:30.173295 4740 generic.go:334] "Generic (PLEG): container finished" podID="261c319a-37da-4987-a774-ecc24fa6b083" containerID="5e71d5cf5b7f04e611003ded79bb677d979a68bb5d8c13a30a7adf835aa9fb4f" exitCode=0 Jan 05 14:07:30 crc kubenswrapper[4740]: I0105 14:07:30.173596 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"261c319a-37da-4987-a774-ecc24fa6b083","Type":"ContainerDied","Data":"5e71d5cf5b7f04e611003ded79bb677d979a68bb5d8c13a30a7adf835aa9fb4f"} Jan 05 14:07:30 crc kubenswrapper[4740]: I0105 14:07:30.999252 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:31 crc kubenswrapper[4740]: E0105 14:07:30.999509 4740 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 05 14:07:31 crc kubenswrapper[4740]: E0105 14:07:30.999831 4740 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 05 14:07:31 crc 
kubenswrapper[4740]: E0105 14:07:30.999881 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift podName:e1b97919-20e0-4eb9-a60b-0f52a4d7c73b nodeName:}" failed. No retries permitted until 2026-01-05 14:07:38.999866118 +0000 UTC m=+1108.306774697 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift") pod "swift-storage-0" (UID: "e1b97919-20e0-4eb9-a60b-0f52a4d7c73b") : configmap "swift-ring-files" not found Jan 05 14:07:31 crc kubenswrapper[4740]: I0105 14:07:31.899734 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:07:31 crc kubenswrapper[4740]: I0105 14:07:31.962469 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8wm86"] Jan 05 14:07:31 crc kubenswrapper[4740]: I0105 14:07:31.962821 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-8wm86" podUID="0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7" containerName="dnsmasq-dns" containerID="cri-o://d5b31cdbad2ff38f36564588d2cd40bd23cff3cf5d6c256d1984b2468de9fc90" gracePeriod=10 Jan 05 14:07:31 crc kubenswrapper[4740]: I0105 14:07:31.967290 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.195458 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"261c319a-37da-4987-a774-ecc24fa6b083","Type":"ContainerStarted","Data":"7bb6f8c6db25449b16a74dc5908883d2bd37222f79207529ff4c13d3419dd47e"} Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.201938 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qvtqr" event={"ID":"a24dd80e-d5b8-4449-96f3-d2682acd78c8","Type":"ContainerStarted","Data":"3a89a34dd527608fbe49e808d35a2ddac1d98be39e2c95dfcd32ce24c5510dbb"} Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.203905 4740 generic.go:334] "Generic (PLEG): container finished" podID="0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7" containerID="d5b31cdbad2ff38f36564588d2cd40bd23cff3cf5d6c256d1984b2468de9fc90" exitCode=0 Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.203949 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8wm86" event={"ID":"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7","Type":"ContainerDied","Data":"d5b31cdbad2ff38f36564588d2cd40bd23cff3cf5d6c256d1984b2468de9fc90"} Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.207924 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cfa6a7e6-018b-43df-8084-48e1aa10e2ca","Type":"ContainerStarted","Data":"4a75b8e7c93c609a1897185d6697b21e1f68678db7e22a100668ce9bbd67f312"} Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.223169 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371981.631626 podStartE2EDuration="55.223149699s" podCreationTimestamp="2026-01-05 14:06:37 +0000 UTC" firstStartedPulling="2026-01-05 14:06:40.875719893 +0000 UTC m=+1050.182628472" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:07:32.222145802 +0000 UTC m=+1101.529054381" watchObservedRunningTime="2026-01-05 
14:07:32.223149699 +0000 UTC m=+1101.530058278" Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.249009 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-qvtqr" podStartSLOduration=3.1048773020000002 podStartE2EDuration="9.24899204s" podCreationTimestamp="2026-01-05 14:07:23 +0000 UTC" firstStartedPulling="2026-01-05 14:07:25.482811973 +0000 UTC m=+1094.789720552" lastFinishedPulling="2026-01-05 14:07:31.626926721 +0000 UTC m=+1100.933835290" observedRunningTime="2026-01-05 14:07:32.23853772 +0000 UTC m=+1101.545446299" watchObservedRunningTime="2026-01-05 14:07:32.24899204 +0000 UTC m=+1101.555900619" Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.489060 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.541390 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-config\") pod \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.541759 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-ovsdbserver-nb\") pod \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.541871 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llbqr\" (UniqueName: \"kubernetes.io/projected/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-kube-api-access-llbqr\") pod \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.541986 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-ovsdbserver-sb\") pod \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.542110 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-dns-svc\") pod \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\" (UID: \"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7\") " Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.551546 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-kube-api-access-llbqr" (OuterVolumeSpecName: "kube-api-access-llbqr") pod "0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7" (UID: "0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7"). InnerVolumeSpecName "kube-api-access-llbqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.602790 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7" (UID: "0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.610485 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-config" (OuterVolumeSpecName: "config") pod "0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7" (UID: "0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.610539 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7" (UID: "0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.617448 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7" (UID: "0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.644571 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.644921 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llbqr\" (UniqueName: \"kubernetes.io/projected/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-kube-api-access-llbqr\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.644934 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.644945 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:32 crc kubenswrapper[4740]: I0105 14:07:32.644958 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:33 crc kubenswrapper[4740]: I0105 14:07:33.229725 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8wm86" event={"ID":"0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7","Type":"ContainerDied","Data":"d6deec3f4e846994889d57c13da85c5977382f0de65d6a197c380bc5bf4c5abe"} Jan 05 14:07:33 crc kubenswrapper[4740]: I0105 14:07:33.229777 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8wm86" Jan 05 14:07:33 crc kubenswrapper[4740]: I0105 14:07:33.229798 4740 scope.go:117] "RemoveContainer" containerID="d5b31cdbad2ff38f36564588d2cd40bd23cff3cf5d6c256d1984b2468de9fc90" Jan 05 14:07:33 crc kubenswrapper[4740]: I0105 14:07:33.271247 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8wm86"] Jan 05 14:07:33 crc kubenswrapper[4740]: I0105 14:07:33.281390 4740 scope.go:117] "RemoveContainer" containerID="b15349ac089e2a2a618e19ca3d7ef31eef8331d501a3fc3464404130fe69133e" Jan 05 14:07:33 crc kubenswrapper[4740]: I0105 14:07:33.287925 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8wm86"] Jan 05 14:07:34 crc kubenswrapper[4740]: I0105 14:07:34.983388 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7" path="/var/lib/kubelet/pods/0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7/volumes" Jan 05 14:07:37 crc kubenswrapper[4740]: I0105 14:07:37.113812 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 05 14:07:37 crc kubenswrapper[4740]: I0105 14:07:37.280221 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cfa6a7e6-018b-43df-8084-48e1aa10e2ca","Type":"ContainerStarted","Data":"1e18d3a01fd61abd767af8db666a3962b57b0a7bd221181f75df6fdce6d0a5c0"} Jan 05 14:07:37 crc kubenswrapper[4740]: I0105 14:07:37.541029 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 05 14:07:37 crc kubenswrapper[4740]: I0105 14:07:37.923597 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 05 14:07:38 crc kubenswrapper[4740]: I0105 14:07:38.930224 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6bbc7d56d-v8cvv" podUID="8ea2f4af-f899-4832-98f1-e56e7665d2ba" containerName="console" containerID="cri-o://8d4c6e477b93e27520954359820f3f9f67bc6f598d363d22ec23f63f17a55cc3" gracePeriod=15 Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.009840 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:39 crc kubenswrapper[4740]: E0105 14:07:39.010026 4740 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 05 14:07:39 crc kubenswrapper[4740]: E0105 14:07:39.010047 4740 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 05 14:07:39 crc kubenswrapper[4740]: E0105 14:07:39.010125 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift podName:e1b97919-20e0-4eb9-a60b-0f52a4d7c73b nodeName:}" failed. No retries permitted until 2026-01-05 14:07:55.010106534 +0000 UTC m=+1124.317015113 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift") pod "swift-storage-0" (UID: "e1b97919-20e0-4eb9-a60b-0f52a4d7c73b") : configmap "swift-ring-files" not found Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.304896 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bbc7d56d-v8cvv_8ea2f4af-f899-4832-98f1-e56e7665d2ba/console/0.log" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.305165 4740 generic.go:334] "Generic (PLEG): container finished" podID="8ea2f4af-f899-4832-98f1-e56e7665d2ba" containerID="8d4c6e477b93e27520954359820f3f9f67bc6f598d363d22ec23f63f17a55cc3" exitCode=2 Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.305218 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bbc7d56d-v8cvv" event={"ID":"8ea2f4af-f899-4832-98f1-e56e7665d2ba","Type":"ContainerDied","Data":"8d4c6e477b93e27520954359820f3f9f67bc6f598d363d22ec23f63f17a55cc3"} Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.308563 4740 generic.go:334] "Generic (PLEG): container finished" podID="a24dd80e-d5b8-4449-96f3-d2682acd78c8" containerID="3a89a34dd527608fbe49e808d35a2ddac1d98be39e2c95dfcd32ce24c5510dbb" exitCode=0 Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.308586 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qvtqr" event={"ID":"a24dd80e-d5b8-4449-96f3-d2682acd78c8","Type":"ContainerDied","Data":"3a89a34dd527608fbe49e808d35a2ddac1d98be39e2c95dfcd32ce24c5510dbb"} Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.462364 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8281-account-create-update-fmlwx"] Jan 05 14:07:39 crc kubenswrapper[4740]: E0105 14:07:39.462866 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7" containerName="dnsmasq-dns" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.462882 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7" containerName="dnsmasq-dns" Jan 05 14:07:39 crc kubenswrapper[4740]: E0105 14:07:39.462900 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e8caefd-2fb4-46aa-8d80-83376f1660bf" containerName="init" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.462907 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e8caefd-2fb4-46aa-8d80-83376f1660bf" containerName="init" Jan 05 14:07:39 crc kubenswrapper[4740]: E0105 14:07:39.462923 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7" containerName="init" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.462932 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7" containerName="init" Jan 05 14:07:39 crc kubenswrapper[4740]: E0105 14:07:39.462945 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e8caefd-2fb4-46aa-8d80-83376f1660bf" containerName="dnsmasq-dns" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.462952 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e8caefd-2fb4-46aa-8d80-83376f1660bf" containerName="dnsmasq-dns" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.463218 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e8caefd-2fb4-46aa-8d80-83376f1660bf" containerName="dnsmasq-dns" Jan 05 14:07:39 crc kubenswrapper[4740]: 
I0105 14:07:39.463248 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bda3b7e-3253-46a7-b1c5-1e48e8ac2db7" containerName="dnsmasq-dns" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.463931 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8281-account-create-update-fmlwx" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.466509 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.471094 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-rm8zj"] Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.472416 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rm8zj" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.489396 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rm8zj"] Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.503103 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8281-account-create-update-fmlwx"] Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.514827 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bbc7d56d-v8cvv_8ea2f4af-f899-4832-98f1-e56e7665d2ba/console/0.log" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.514882 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.520816 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00-operator-scripts\") pod \"keystone-8281-account-create-update-fmlwx\" (UID: \"e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00\") " pod="openstack/keystone-8281-account-create-update-fmlwx" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.521315 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbddx\" (UniqueName: \"kubernetes.io/projected/25773a1d-61fa-499b-a0b5-43dfdaafa421-kube-api-access-dbddx\") pod \"keystone-db-create-rm8zj\" (UID: \"25773a1d-61fa-499b-a0b5-43dfdaafa421\") " pod="openstack/keystone-db-create-rm8zj" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.521369 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25773a1d-61fa-499b-a0b5-43dfdaafa421-operator-scripts\") pod \"keystone-db-create-rm8zj\" (UID: \"25773a1d-61fa-499b-a0b5-43dfdaafa421\") " pod="openstack/keystone-db-create-rm8zj" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.521567 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqc92\" (UniqueName: \"kubernetes.io/projected/e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00-kube-api-access-vqc92\") pod \"keystone-8281-account-create-update-fmlwx\" (UID: \"e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00\") " pod="openstack/keystone-8281-account-create-update-fmlwx" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.551008 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-v4lk2"] Jan 05 14:07:39 crc kubenswrapper[4740]: E0105 14:07:39.552286 4740 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ea2f4af-f899-4832-98f1-e56e7665d2ba" containerName="console" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.552404 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea2f4af-f899-4832-98f1-e56e7665d2ba" containerName="console" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.553524 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ea2f4af-f899-4832-98f1-e56e7665d2ba" containerName="console" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.554756 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-v4lk2" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.598678 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-v4lk2"] Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.622374 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-oauth-config\") pod \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.622433 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-config\") pod \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.622465 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-trusted-ca-bundle\") pod \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.622540 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-serving-cert\") pod \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.622570 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-service-ca\") pod \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.622692 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n2d8\" (UniqueName: \"kubernetes.io/projected/8ea2f4af-f899-4832-98f1-e56e7665d2ba-kube-api-access-8n2d8\") pod \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.622737 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-oauth-serving-cert\") pod \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\" (UID: \"8ea2f4af-f899-4832-98f1-e56e7665d2ba\") " Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.623205 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-service-ca" (OuterVolumeSpecName: "service-ca") pod "8ea2f4af-f899-4832-98f1-e56e7665d2ba" (UID: "8ea2f4af-f899-4832-98f1-e56e7665d2ba"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.623247 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8ea2f4af-f899-4832-98f1-e56e7665d2ba" (UID: "8ea2f4af-f899-4832-98f1-e56e7665d2ba"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.623328 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-config" (OuterVolumeSpecName: "console-config") pod "8ea2f4af-f899-4832-98f1-e56e7665d2ba" (UID: "8ea2f4af-f899-4832-98f1-e56e7665d2ba"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.623332 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnxjs\" (UniqueName: \"kubernetes.io/projected/236225ce-ccde-40de-a45f-c19e52ece918-kube-api-access-nnxjs\") pod \"placement-db-create-v4lk2\" (UID: \"236225ce-ccde-40de-a45f-c19e52ece918\") " pod="openstack/placement-db-create-v4lk2" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.623419 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbddx\" (UniqueName: \"kubernetes.io/projected/25773a1d-61fa-499b-a0b5-43dfdaafa421-kube-api-access-dbddx\") pod \"keystone-db-create-rm8zj\" (UID: \"25773a1d-61fa-499b-a0b5-43dfdaafa421\") " pod="openstack/keystone-db-create-rm8zj" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.623570 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25773a1d-61fa-499b-a0b5-43dfdaafa421-operator-scripts\") pod \"keystone-db-create-rm8zj\" (UID: \"25773a1d-61fa-499b-a0b5-43dfdaafa421\") " pod="openstack/keystone-db-create-rm8zj" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.623631 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/236225ce-ccde-40de-a45f-c19e52ece918-operator-scripts\") pod \"placement-db-create-v4lk2\" (UID: \"236225ce-ccde-40de-a45f-c19e52ece918\") " pod="openstack/placement-db-create-v4lk2" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.623635 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8ea2f4af-f899-4832-98f1-e56e7665d2ba" (UID: "8ea2f4af-f899-4832-98f1-e56e7665d2ba"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.623980 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqc92\" (UniqueName: \"kubernetes.io/projected/e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00-kube-api-access-vqc92\") pod \"keystone-8281-account-create-update-fmlwx\" (UID: \"e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00\") " pod="openstack/keystone-8281-account-create-update-fmlwx" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.624220 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00-operator-scripts\") pod \"keystone-8281-account-create-update-fmlwx\" (UID: \"e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00\") " pod="openstack/keystone-8281-account-create-update-fmlwx" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.624317 4740 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-service-ca\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.624339 4740 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.624352 4740 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.624362 4740 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ea2f4af-f899-4832-98f1-e56e7665d2ba-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.624829 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00-operator-scripts\") pod \"keystone-8281-account-create-update-fmlwx\" (UID: \"e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00\") " pod="openstack/keystone-8281-account-create-update-fmlwx" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.625722 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25773a1d-61fa-499b-a0b5-43dfdaafa421-operator-scripts\") pod \"keystone-db-create-rm8zj\" (UID: \"25773a1d-61fa-499b-a0b5-43dfdaafa421\") " pod="openstack/keystone-db-create-rm8zj" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.633391 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8ea2f4af-f899-4832-98f1-e56e7665d2ba" (UID: "8ea2f4af-f899-4832-98f1-e56e7665d2ba"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.633429 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8ea2f4af-f899-4832-98f1-e56e7665d2ba" (UID: "8ea2f4af-f899-4832-98f1-e56e7665d2ba"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.633492 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ea2f4af-f899-4832-98f1-e56e7665d2ba-kube-api-access-8n2d8" (OuterVolumeSpecName: "kube-api-access-8n2d8") pod "8ea2f4af-f899-4832-98f1-e56e7665d2ba" (UID: "8ea2f4af-f899-4832-98f1-e56e7665d2ba"). InnerVolumeSpecName "kube-api-access-8n2d8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.641229 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbddx\" (UniqueName: \"kubernetes.io/projected/25773a1d-61fa-499b-a0b5-43dfdaafa421-kube-api-access-dbddx\") pod \"keystone-db-create-rm8zj\" (UID: \"25773a1d-61fa-499b-a0b5-43dfdaafa421\") " pod="openstack/keystone-db-create-rm8zj" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.646685 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqc92\" (UniqueName: \"kubernetes.io/projected/e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00-kube-api-access-vqc92\") pod \"keystone-8281-account-create-update-fmlwx\" (UID: \"e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00\") " pod="openstack/keystone-8281-account-create-update-fmlwx" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.675255 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.675293 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.677037 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5361-account-create-update-m2b2w"] Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.678442 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5361-account-create-update-m2b2w" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.680720 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.688267 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5361-account-create-update-m2b2w"] Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.725727 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2589e83-7430-45b3-95ad-a91a164948a6-operator-scripts\") pod \"placement-5361-account-create-update-m2b2w\" (UID: \"c2589e83-7430-45b3-95ad-a91a164948a6\") " pod="openstack/placement-5361-account-create-update-m2b2w" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.725834 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpm92\" (UniqueName: \"kubernetes.io/projected/c2589e83-7430-45b3-95ad-a91a164948a6-kube-api-access-dpm92\") pod \"placement-5361-account-create-update-m2b2w\" (UID: \"c2589e83-7430-45b3-95ad-a91a164948a6\") " pod="openstack/placement-5361-account-create-update-m2b2w" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.725886 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnxjs\" (UniqueName: \"kubernetes.io/projected/236225ce-ccde-40de-a45f-c19e52ece918-kube-api-access-nnxjs\") pod \"placement-db-create-v4lk2\" (UID: \"236225ce-ccde-40de-a45f-c19e52ece918\") " pod="openstack/placement-db-create-v4lk2" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.726204 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/236225ce-ccde-40de-a45f-c19e52ece918-operator-scripts\") pod \"placement-db-create-v4lk2\" (UID: \"236225ce-ccde-40de-a45f-c19e52ece918\") " pod="openstack/placement-db-create-v4lk2" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.726995 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/236225ce-ccde-40de-a45f-c19e52ece918-operator-scripts\") pod \"placement-db-create-v4lk2\" (UID: \"236225ce-ccde-40de-a45f-c19e52ece918\") " pod="openstack/placement-db-create-v4lk2" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.727940 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n2d8\" (UniqueName: \"kubernetes.io/projected/8ea2f4af-f899-4832-98f1-e56e7665d2ba-kube-api-access-8n2d8\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.728031 4740 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.728211 4740 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ea2f4af-f899-4832-98f1-e56e7665d2ba-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.745583 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnxjs\" (UniqueName: 
\"kubernetes.io/projected/236225ce-ccde-40de-a45f-c19e52ece918-kube-api-access-nnxjs\") pod \"placement-db-create-v4lk2\" (UID: \"236225ce-ccde-40de-a45f-c19e52ece918\") " pod="openstack/placement-db-create-v4lk2" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.766201 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.830284 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2589e83-7430-45b3-95ad-a91a164948a6-operator-scripts\") pod \"placement-5361-account-create-update-m2b2w\" (UID: \"c2589e83-7430-45b3-95ad-a91a164948a6\") " pod="openstack/placement-5361-account-create-update-m2b2w" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.830414 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpm92\" (UniqueName: \"kubernetes.io/projected/c2589e83-7430-45b3-95ad-a91a164948a6-kube-api-access-dpm92\") pod \"placement-5361-account-create-update-m2b2w\" (UID: \"c2589e83-7430-45b3-95ad-a91a164948a6\") " pod="openstack/placement-5361-account-create-update-m2b2w" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.832183 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2589e83-7430-45b3-95ad-a91a164948a6-operator-scripts\") pod \"placement-5361-account-create-update-m2b2w\" (UID: \"c2589e83-7430-45b3-95ad-a91a164948a6\") " pod="openstack/placement-5361-account-create-update-m2b2w" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.833574 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8281-account-create-update-fmlwx" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.846077 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rm8zj" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.849361 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpm92\" (UniqueName: \"kubernetes.io/projected/c2589e83-7430-45b3-95ad-a91a164948a6-kube-api-access-dpm92\") pod \"placement-5361-account-create-update-m2b2w\" (UID: \"c2589e83-7430-45b3-95ad-a91a164948a6\") " pod="openstack/placement-5361-account-create-update-m2b2w" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.874283 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-v4lk2" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.940375 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-wmlbx"] Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.942159 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wmlbx" Jan 05 14:07:39 crc kubenswrapper[4740]: I0105 14:07:39.952357 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-wmlbx"] Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.032428 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5361-account-create-update-m2b2w" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.036245 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt7qq\" (UniqueName: \"kubernetes.io/projected/77686b86-ebf6-47a1-9018-a32f1a089def-kube-api-access-kt7qq\") pod \"glance-db-create-wmlbx\" (UID: \"77686b86-ebf6-47a1-9018-a32f1a089def\") " pod="openstack/glance-db-create-wmlbx" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.036292 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77686b86-ebf6-47a1-9018-a32f1a089def-operator-scripts\") pod \"glance-db-create-wmlbx\" (UID: \"77686b86-ebf6-47a1-9018-a32f1a089def\") " pod="openstack/glance-db-create-wmlbx" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.045452 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-98ff-account-create-update-48ls5"] Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.047465 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-98ff-account-create-update-48ls5" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.051387 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.065211 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-98ff-account-create-update-48ls5"] Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.137763 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sm4t\" (UniqueName: \"kubernetes.io/projected/220854d5-c397-408e-b4f1-4f5c7a9ab8b2-kube-api-access-9sm4t\") pod \"glance-98ff-account-create-update-48ls5\" (UID: \"220854d5-c397-408e-b4f1-4f5c7a9ab8b2\") " pod="openstack/glance-98ff-account-create-update-48ls5" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.137840 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/220854d5-c397-408e-b4f1-4f5c7a9ab8b2-operator-scripts\") pod \"glance-98ff-account-create-update-48ls5\" (UID: \"220854d5-c397-408e-b4f1-4f5c7a9ab8b2\") " pod="openstack/glance-98ff-account-create-update-48ls5" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.137924 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt7qq\" (UniqueName: \"kubernetes.io/projected/77686b86-ebf6-47a1-9018-a32f1a089def-kube-api-access-kt7qq\") pod \"glance-db-create-wmlbx\" (UID: \"77686b86-ebf6-47a1-9018-a32f1a089def\") " pod="openstack/glance-db-create-wmlbx" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.137964 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77686b86-ebf6-47a1-9018-a32f1a089def-operator-scripts\") pod \"glance-db-create-wmlbx\" (UID: \"77686b86-ebf6-47a1-9018-a32f1a089def\") " pod="openstack/glance-db-create-wmlbx" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.138874 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77686b86-ebf6-47a1-9018-a32f1a089def-operator-scripts\") pod \"glance-db-create-wmlbx\" (UID: 
\"77686b86-ebf6-47a1-9018-a32f1a089def\") " pod="openstack/glance-db-create-wmlbx" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.156146 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt7qq\" (UniqueName: \"kubernetes.io/projected/77686b86-ebf6-47a1-9018-a32f1a089def-kube-api-access-kt7qq\") pod \"glance-db-create-wmlbx\" (UID: \"77686b86-ebf6-47a1-9018-a32f1a089def\") " pod="openstack/glance-db-create-wmlbx" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.244165 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sm4t\" (UniqueName: \"kubernetes.io/projected/220854d5-c397-408e-b4f1-4f5c7a9ab8b2-kube-api-access-9sm4t\") pod \"glance-98ff-account-create-update-48ls5\" (UID: \"220854d5-c397-408e-b4f1-4f5c7a9ab8b2\") " pod="openstack/glance-98ff-account-create-update-48ls5" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.244247 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/220854d5-c397-408e-b4f1-4f5c7a9ab8b2-operator-scripts\") pod \"glance-98ff-account-create-update-48ls5\" (UID: \"220854d5-c397-408e-b4f1-4f5c7a9ab8b2\") " pod="openstack/glance-98ff-account-create-update-48ls5" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.244972 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/220854d5-c397-408e-b4f1-4f5c7a9ab8b2-operator-scripts\") pod \"glance-98ff-account-create-update-48ls5\" (UID: \"220854d5-c397-408e-b4f1-4f5c7a9ab8b2\") " pod="openstack/glance-98ff-account-create-update-48ls5" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.261334 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sm4t\" (UniqueName: \"kubernetes.io/projected/220854d5-c397-408e-b4f1-4f5c7a9ab8b2-kube-api-access-9sm4t\") pod \"glance-98ff-account-create-update-48ls5\" (UID: \"220854d5-c397-408e-b4f1-4f5c7a9ab8b2\") " pod="openstack/glance-98ff-account-create-update-48ls5" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.269799 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wmlbx" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.325986 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bbc7d56d-v8cvv_8ea2f4af-f899-4832-98f1-e56e7665d2ba/console/0.log" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.326359 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bbc7d56d-v8cvv" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.329342 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bbc7d56d-v8cvv" event={"ID":"8ea2f4af-f899-4832-98f1-e56e7665d2ba","Type":"ContainerDied","Data":"70536a2b66e03324823bf68de0d7a448842aa5a0c22133bae87b1574c6eb8a50"} Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.329382 4740 scope.go:117] "RemoveContainer" containerID="8d4c6e477b93e27520954359820f3f9f67bc6f598d363d22ec23f63f17a55cc3" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.376700 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bbc7d56d-v8cvv"] Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.389751 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6bbc7d56d-v8cvv"] Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.405181 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-98ff-account-create-update-48ls5" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.424405 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 05 14:07:40 crc kubenswrapper[4740]: I0105 14:07:40.981502 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ea2f4af-f899-4832-98f1-e56e7665d2ba" path="/var/lib/kubelet/pods/8ea2f4af-f899-4832-98f1-e56e7665d2ba/volumes" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.173398 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.273301 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a24dd80e-d5b8-4449-96f3-d2682acd78c8-ring-data-devices\") pod \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.273483 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-swiftconf\") pod \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.273521 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a24dd80e-d5b8-4449-96f3-d2682acd78c8-etc-swift\") pod \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.273566 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-combined-ca-bundle\") pod \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.273687 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dhtc\" (UniqueName: \"kubernetes.io/projected/a24dd80e-d5b8-4449-96f3-d2682acd78c8-kube-api-access-8dhtc\") pod \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.273745 4740 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-dispersionconf\") pod \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.273799 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a24dd80e-d5b8-4449-96f3-d2682acd78c8-scripts\") pod \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\" (UID: \"a24dd80e-d5b8-4449-96f3-d2682acd78c8\") " Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.274509 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a24dd80e-d5b8-4449-96f3-d2682acd78c8-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a24dd80e-d5b8-4449-96f3-d2682acd78c8" (UID: "a24dd80e-d5b8-4449-96f3-d2682acd78c8"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.275903 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a24dd80e-d5b8-4449-96f3-d2682acd78c8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a24dd80e-d5b8-4449-96f3-d2682acd78c8" (UID: "a24dd80e-d5b8-4449-96f3-d2682acd78c8"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.290445 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a24dd80e-d5b8-4449-96f3-d2682acd78c8-kube-api-access-8dhtc" (OuterVolumeSpecName: "kube-api-access-8dhtc") pod "a24dd80e-d5b8-4449-96f3-d2682acd78c8" (UID: "a24dd80e-d5b8-4449-96f3-d2682acd78c8"). InnerVolumeSpecName "kube-api-access-8dhtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.301265 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a24dd80e-d5b8-4449-96f3-d2682acd78c8" (UID: "a24dd80e-d5b8-4449-96f3-d2682acd78c8"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.322043 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a24dd80e-d5b8-4449-96f3-d2682acd78c8-scripts" (OuterVolumeSpecName: "scripts") pod "a24dd80e-d5b8-4449-96f3-d2682acd78c8" (UID: "a24dd80e-d5b8-4449-96f3-d2682acd78c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.330983 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a24dd80e-d5b8-4449-96f3-d2682acd78c8" (UID: "a24dd80e-d5b8-4449-96f3-d2682acd78c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.338890 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qvtqr" event={"ID":"a24dd80e-d5b8-4449-96f3-d2682acd78c8","Type":"ContainerDied","Data":"42efa0f2a035838b15f34b5fa614bcaaf0eb6b5de6c340c3bf3c0e2ae23fcb5e"} Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.338930 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42efa0f2a035838b15f34b5fa614bcaaf0eb6b5de6c340c3bf3c0e2ae23fcb5e" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.338960 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qvtqr" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.344666 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a24dd80e-d5b8-4449-96f3-d2682acd78c8" (UID: "a24dd80e-d5b8-4449-96f3-d2682acd78c8"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.371783 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-fbfrn"] Jan 05 14:07:41 crc kubenswrapper[4740]: E0105 14:07:41.372359 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24dd80e-d5b8-4449-96f3-d2682acd78c8" containerName="swift-ring-rebalance" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.372623 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24dd80e-d5b8-4449-96f3-d2682acd78c8" containerName="swift-ring-rebalance" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.372955 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="a24dd80e-d5b8-4449-96f3-d2682acd78c8" containerName="swift-ring-rebalance" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.373741 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-fbfrn" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.376182 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.376222 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dhtc\" (UniqueName: \"kubernetes.io/projected/a24dd80e-d5b8-4449-96f3-d2682acd78c8-kube-api-access-8dhtc\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.376238 4740 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.376251 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a24dd80e-d5b8-4449-96f3-d2682acd78c8-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.376264 4740 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a24dd80e-d5b8-4449-96f3-d2682acd78c8-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.376274 4740 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a24dd80e-d5b8-4449-96f3-d2682acd78c8-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.376282 4740 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a24dd80e-d5b8-4449-96f3-d2682acd78c8-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.395199 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-fbfrn"] Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.582040 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-fbfrn\" (UID: \"c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889\") " pod="openstack/mysqld-exporter-openstack-db-create-fbfrn" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.582374 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hlkm\" (UniqueName: \"kubernetes.io/projected/c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889-kube-api-access-7hlkm\") pod \"mysqld-exporter-openstack-db-create-fbfrn\" (UID: \"c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889\") " pod="openstack/mysqld-exporter-openstack-db-create-fbfrn" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.584728 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-506a-account-create-update-w45h2"] Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.586209 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-506a-account-create-update-w45h2" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.590368 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.600476 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-506a-account-create-update-w45h2"] Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.684688 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-fbfrn\" (UID: \"c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889\") " pod="openstack/mysqld-exporter-openstack-db-create-fbfrn" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.684744 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63b41b33-b5bf-418b-b5c5-595e952f1380-operator-scripts\") pod \"mysqld-exporter-506a-account-create-update-w45h2\" (UID: \"63b41b33-b5bf-418b-b5c5-595e952f1380\") " pod="openstack/mysqld-exporter-506a-account-create-update-w45h2" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.684771 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hlkm\" (UniqueName: \"kubernetes.io/projected/c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889-kube-api-access-7hlkm\") pod \"mysqld-exporter-openstack-db-create-fbfrn\" (UID: \"c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889\") " pod="openstack/mysqld-exporter-openstack-db-create-fbfrn" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.684828 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jgsg\" (UniqueName: \"kubernetes.io/projected/63b41b33-b5bf-418b-b5c5-595e952f1380-kube-api-access-7jgsg\") pod \"mysqld-exporter-506a-account-create-update-w45h2\" (UID: \"63b41b33-b5bf-418b-b5c5-595e952f1380\") " pod="openstack/mysqld-exporter-506a-account-create-update-w45h2" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.686400 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-fbfrn\" (UID: \"c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889\") " pod="openstack/mysqld-exporter-openstack-db-create-fbfrn" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.721493 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hlkm\" (UniqueName: \"kubernetes.io/projected/c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889-kube-api-access-7hlkm\") pod \"mysqld-exporter-openstack-db-create-fbfrn\" (UID: \"c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889\") " pod="openstack/mysqld-exporter-openstack-db-create-fbfrn" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.787707 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63b41b33-b5bf-418b-b5c5-595e952f1380-operator-scripts\") pod \"mysqld-exporter-506a-account-create-update-w45h2\" (UID: \"63b41b33-b5bf-418b-b5c5-595e952f1380\") " pod="openstack/mysqld-exporter-506a-account-create-update-w45h2" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.788096 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jgsg\" (UniqueName: \"kubernetes.io/projected/63b41b33-b5bf-418b-b5c5-595e952f1380-kube-api-access-7jgsg\") pod \"mysqld-exporter-506a-account-create-update-w45h2\" (UID: \"63b41b33-b5bf-418b-b5c5-595e952f1380\") " pod="openstack/mysqld-exporter-506a-account-create-update-w45h2" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.788727 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-fbfrn" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.788832 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63b41b33-b5bf-418b-b5c5-595e952f1380-operator-scripts\") pod \"mysqld-exporter-506a-account-create-update-w45h2\" (UID: \"63b41b33-b5bf-418b-b5c5-595e952f1380\") " pod="openstack/mysqld-exporter-506a-account-create-update-w45h2" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.808003 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jgsg\" (UniqueName: \"kubernetes.io/projected/63b41b33-b5bf-418b-b5c5-595e952f1380-kube-api-access-7jgsg\") pod \"mysqld-exporter-506a-account-create-update-w45h2\" (UID: \"63b41b33-b5bf-418b-b5c5-595e952f1380\") " pod="openstack/mysqld-exporter-506a-account-create-update-w45h2" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.814569 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8281-account-create-update-fmlwx"] Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.903496 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-506a-account-create-update-w45h2" Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.973470 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rm8zj"] Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.986165 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-98ff-account-create-update-48ls5"] Jan 05 14:07:41 crc kubenswrapper[4740]: I0105 14:07:41.999282 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5361-account-create-update-m2b2w"] Jan 05 14:07:42 crc kubenswrapper[4740]: W0105 14:07:42.008382 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8a07a26_5a7e_49b8_8445_bf1ce2bdbe00.slice/crio-02378e5c8beebda32e97c75a5607dd9ebd52f377a20d70dcadc8ea64b35cf5a6 WatchSource:0}: Error finding container 02378e5c8beebda32e97c75a5607dd9ebd52f377a20d70dcadc8ea64b35cf5a6: Status 404 returned error can't find the container with id 02378e5c8beebda32e97c75a5607dd9ebd52f377a20d70dcadc8ea64b35cf5a6 Jan 05 14:07:42 crc kubenswrapper[4740]: W0105 14:07:42.018501 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2589e83_7430_45b3_95ad_a91a164948a6.slice/crio-26e1f3db533d8bd49ba932d3c370b45593ba196d732f65ec21af9db3c592c36f WatchSource:0}: Error finding container 26e1f3db533d8bd49ba932d3c370b45593ba196d732f65ec21af9db3c592c36f: Status 404 returned error can't find the container with id 26e1f3db533d8bd49ba932d3c370b45593ba196d732f65ec21af9db3c592c36f Jan 05 14:07:42 crc kubenswrapper[4740]: I0105 14:07:42.164936 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-db-create-wmlbx"] Jan 05 14:07:42 crc kubenswrapper[4740]: I0105 14:07:42.220643 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-v4lk2"] Jan 05 14:07:42 crc kubenswrapper[4740]: I0105 14:07:42.357278 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cfa6a7e6-018b-43df-8084-48e1aa10e2ca","Type":"ContainerStarted","Data":"c87eb6635019a99f8f7e55591534536b05ba8a0c168b1758e8cf26729197c061"} Jan 05 14:07:42 crc kubenswrapper[4740]: I0105 14:07:42.360031 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5361-account-create-update-m2b2w" event={"ID":"c2589e83-7430-45b3-95ad-a91a164948a6","Type":"ContainerStarted","Data":"26e1f3db533d8bd49ba932d3c370b45593ba196d732f65ec21af9db3c592c36f"} Jan 05 14:07:42 crc kubenswrapper[4740]: I0105 14:07:42.370151 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8281-account-create-update-fmlwx" event={"ID":"e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00","Type":"ContainerStarted","Data":"02378e5c8beebda32e97c75a5607dd9ebd52f377a20d70dcadc8ea64b35cf5a6"} Jan 05 14:07:42 crc kubenswrapper[4740]: I0105 14:07:42.371567 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-v4lk2" event={"ID":"236225ce-ccde-40de-a45f-c19e52ece918","Type":"ContainerStarted","Data":"227ec86b90b25c6386e8ba1a1b1a4c549f21bb1bfa55a0f896d3cb0dbf4838f2"} Jan 05 14:07:42 crc kubenswrapper[4740]: I0105 14:07:42.372999 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wmlbx" event={"ID":"77686b86-ebf6-47a1-9018-a32f1a089def","Type":"ContainerStarted","Data":"483bfc20e54efa1d9a81a5d623591babec384e3e7ac20ddf4ef2a87182a45a32"} Jan 05 14:07:42 crc kubenswrapper[4740]: I0105 14:07:42.374503 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rm8zj" event={"ID":"25773a1d-61fa-499b-a0b5-43dfdaafa421","Type":"ContainerStarted","Data":"b7f947fa252aba1b6b15b8f34c86bc87e9feb5b81da8a3797e5a00892337661c"} Jan 05 14:07:42 crc kubenswrapper[4740]: I0105 14:07:42.376406 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-98ff-account-create-update-48ls5" event={"ID":"220854d5-c397-408e-b4f1-4f5c7a9ab8b2","Type":"ContainerStarted","Data":"5e0e1e54625058361284319db97bfabcd1b4efb80c870e8de09eb9178e040f2b"} Jan 05 14:07:42 crc kubenswrapper[4740]: I0105 14:07:42.389765 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=26.302438864 podStartE2EDuration="1m1.389747488s" podCreationTimestamp="2026-01-05 14:06:41 +0000 UTC" firstStartedPulling="2026-01-05 14:07:06.129806528 +0000 UTC m=+1075.436715107" lastFinishedPulling="2026-01-05 14:07:41.217115132 +0000 UTC m=+1110.524023731" observedRunningTime="2026-01-05 14:07:42.386727998 +0000 UTC m=+1111.693636597" watchObservedRunningTime="2026-01-05 14:07:42.389747488 +0000 UTC m=+1111.696656067" Jan 05 14:07:42 crc kubenswrapper[4740]: I0105 14:07:42.584115 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-fbfrn"] Jan 05 14:07:42 crc kubenswrapper[4740]: I0105 14:07:42.708156 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-506a-account-create-update-w45h2"] Jan 05 14:07:42 crc kubenswrapper[4740]: W0105 14:07:42.709883 4740 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63b41b33_b5bf_418b_b5c5_595e952f1380.slice/crio-84d3360168215b2a1ee86021954cf4aed302aaedd77586d7a8eff4a2525a3f1b WatchSource:0}: Error finding container 84d3360168215b2a1ee86021954cf4aed302aaedd77586d7a8eff4a2525a3f1b: Status 404 returned error can't find the container with id 84d3360168215b2a1ee86021954cf4aed302aaedd77586d7a8eff4a2525a3f1b Jan 05 14:07:42 crc kubenswrapper[4740]: I0105 14:07:42.927711 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:42 crc kubenswrapper[4740]: I0105 14:07:42.927753 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:42 crc kubenswrapper[4740]: I0105 14:07:42.929772 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.389211 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8281-account-create-update-fmlwx" event={"ID":"e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00","Type":"ContainerStarted","Data":"cb16ffa83ce73d0993c3934e6a9a4e8966c568106bbcc5373f6eb798a9d4d788"} Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.390831 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-506a-account-create-update-w45h2" event={"ID":"63b41b33-b5bf-418b-b5c5-595e952f1380","Type":"ContainerStarted","Data":"8472ad756cad61d21dcee5cb12bc9cc73223e9f11a71ed3979e4ba5b12b239af"} Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.390863 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-506a-account-create-update-w45h2" event={"ID":"63b41b33-b5bf-418b-b5c5-595e952f1380","Type":"ContainerStarted","Data":"84d3360168215b2a1ee86021954cf4aed302aaedd77586d7a8eff4a2525a3f1b"} Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.392722 4740 generic.go:334] "Generic (PLEG): container finished" podID="236225ce-ccde-40de-a45f-c19e52ece918" containerID="4d165707cc32025d51288c8e4e521265721edec5401826803867494484f1c03d" exitCode=0 Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.392826 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-v4lk2" event={"ID":"236225ce-ccde-40de-a45f-c19e52ece918","Type":"ContainerDied","Data":"4d165707cc32025d51288c8e4e521265721edec5401826803867494484f1c03d"} Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.394287 4740 generic.go:334] "Generic (PLEG): container finished" podID="77686b86-ebf6-47a1-9018-a32f1a089def" containerID="81a3ee8545f2033c596c286d3ade79385f08d5f91f7b77d857a469f3eccbec3e" exitCode=0 Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.394344 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wmlbx" event={"ID":"77686b86-ebf6-47a1-9018-a32f1a089def","Type":"ContainerDied","Data":"81a3ee8545f2033c596c286d3ade79385f08d5f91f7b77d857a469f3eccbec3e"} Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.396085 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rm8zj" event={"ID":"25773a1d-61fa-499b-a0b5-43dfdaafa421","Type":"ContainerStarted","Data":"9bf43a34cde5a558352caf73d932ff7decda5087803faf196b44cdd9ea50b484"} Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.397768 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-98ff-account-create-update-48ls5" event={"ID":"220854d5-c397-408e-b4f1-4f5c7a9ab8b2","Type":"ContainerStarted","Data":"2e79f741f4173c037d418f97c730da16b0bba6d15f5f73c4a6341b4a76f3ad1f"} Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.400381 4740 generic.go:334] "Generic (PLEG): container finished" podID="c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889" containerID="cd6b3b4eefadf7d19bd0fa339f92ed4fd32f5292fb84b9e63019f49df6acc14f" exitCode=0 Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.400478 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-fbfrn" event={"ID":"c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889","Type":"ContainerDied","Data":"cd6b3b4eefadf7d19bd0fa339f92ed4fd32f5292fb84b9e63019f49df6acc14f"} Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.400505 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-fbfrn" event={"ID":"c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889","Type":"ContainerStarted","Data":"e28b0647dfb078c58b9724e7d4a557d6d5517bd60d586f185368c866fd434589"} Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.403749 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5361-account-create-update-m2b2w" event={"ID":"c2589e83-7430-45b3-95ad-a91a164948a6","Type":"ContainerStarted","Data":"174a7349cf3ac32a5b531c896aeb97cc51cf326a1acf7a4c3e76f0f836a180ca"} Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.405467 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.414023 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-8281-account-create-update-fmlwx" podStartSLOduration=4.414007839 podStartE2EDuration="4.414007839s" podCreationTimestamp="2026-01-05 14:07:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:07:43.404128875 +0000 UTC m=+1112.711037474" watchObservedRunningTime="2026-01-05 14:07:43.414007839 +0000 UTC m=+1112.720916418" Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.439605 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5361-account-create-update-m2b2w" podStartSLOduration=4.439584735 podStartE2EDuration="4.439584735s" podCreationTimestamp="2026-01-05 14:07:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:07:43.438160446 +0000 UTC m=+1112.745069045" watchObservedRunningTime="2026-01-05 14:07:43.439584735 +0000 UTC m=+1112.746493314" Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.483626 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-506a-account-create-update-w45h2" podStartSLOduration=2.483606473 podStartE2EDuration="2.483606473s" podCreationTimestamp="2026-01-05 14:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:07:43.467130972 +0000 UTC m=+1112.774039561" watchObservedRunningTime="2026-01-05 14:07:43.483606473 +0000 UTC m=+1112.790515052" Jan 05 14:07:43 crc kubenswrapper[4740]: I0105 14:07:43.490521 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-98ff-account-create-update-48ls5" podStartSLOduration=3.490504079 podStartE2EDuration="3.490504079s" podCreationTimestamp="2026-01-05 14:07:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:07:43.479668078 +0000 UTC m=+1112.786576657" watchObservedRunningTime="2026-01-05 14:07:43.490504079 +0000 UTC m=+1112.797412658" Jan 05 14:07:44 crc kubenswrapper[4740]: I0105 14:07:44.413216 4740 generic.go:334] "Generic (PLEG): container finished" podID="c2589e83-7430-45b3-95ad-a91a164948a6" containerID="174a7349cf3ac32a5b531c896aeb97cc51cf326a1acf7a4c3e76f0f836a180ca" exitCode=0 Jan 05 14:07:44 crc kubenswrapper[4740]: I0105 14:07:44.413318 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5361-account-create-update-m2b2w" event={"ID":"c2589e83-7430-45b3-95ad-a91a164948a6","Type":"ContainerDied","Data":"174a7349cf3ac32a5b531c896aeb97cc51cf326a1acf7a4c3e76f0f836a180ca"} Jan 05 14:07:44 crc kubenswrapper[4740]: I0105 14:07:44.415380 4740 generic.go:334] "Generic (PLEG): container finished" podID="e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00" containerID="cb16ffa83ce73d0993c3934e6a9a4e8966c568106bbcc5373f6eb798a9d4d788" exitCode=0 Jan 05 14:07:44 crc kubenswrapper[4740]: I0105 14:07:44.415453 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8281-account-create-update-fmlwx" event={"ID":"e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00","Type":"ContainerDied","Data":"cb16ffa83ce73d0993c3934e6a9a4e8966c568106bbcc5373f6eb798a9d4d788"} Jan 05 14:07:44 crc kubenswrapper[4740]: I0105 14:07:44.416973 4740 generic.go:334] "Generic (PLEG): container finished" podID="63b41b33-b5bf-418b-b5c5-595e952f1380" containerID="8472ad756cad61d21dcee5cb12bc9cc73223e9f11a71ed3979e4ba5b12b239af" exitCode=0 Jan 05 14:07:44 crc kubenswrapper[4740]: I0105 14:07:44.417004 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-506a-account-create-update-w45h2" event={"ID":"63b41b33-b5bf-418b-b5c5-595e952f1380","Type":"ContainerDied","Data":"8472ad756cad61d21dcee5cb12bc9cc73223e9f11a71ed3979e4ba5b12b239af"} Jan 05 14:07:44 crc kubenswrapper[4740]: I0105 14:07:44.418881 4740 generic.go:334] "Generic (PLEG): container finished" podID="25773a1d-61fa-499b-a0b5-43dfdaafa421" containerID="9bf43a34cde5a558352caf73d932ff7decda5087803faf196b44cdd9ea50b484" exitCode=0 Jan 05 14:07:44 crc kubenswrapper[4740]: I0105 14:07:44.418953 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rm8zj" event={"ID":"25773a1d-61fa-499b-a0b5-43dfdaafa421","Type":"ContainerDied","Data":"9bf43a34cde5a558352caf73d932ff7decda5087803faf196b44cdd9ea50b484"} Jan 05 14:07:44 crc kubenswrapper[4740]: I0105 14:07:44.420773 4740 generic.go:334] "Generic (PLEG): container finished" podID="220854d5-c397-408e-b4f1-4f5c7a9ab8b2" containerID="2e79f741f4173c037d418f97c730da16b0bba6d15f5f73c4a6341b4a76f3ad1f" exitCode=0 Jan 05 14:07:44 crc kubenswrapper[4740]: I0105 14:07:44.420839 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-98ff-account-create-update-48ls5" event={"ID":"220854d5-c397-408e-b4f1-4f5c7a9ab8b2","Type":"ContainerDied","Data":"2e79f741f4173c037d418f97c730da16b0bba6d15f5f73c4a6341b4a76f3ad1f"} Jan 05 14:07:44 crc kubenswrapper[4740]: I0105 14:07:44.892362 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hr9pm" podUID="da00ff66-0241-449f-9ceb-9c9849d5f646" 
containerName="ovn-controller" probeResult="failure" output=< Jan 05 14:07:44 crc kubenswrapper[4740]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 05 14:07:44 crc kubenswrapper[4740]: > Jan 05 14:07:44 crc kubenswrapper[4740]: I0105 14:07:44.944953 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:07:44 crc kubenswrapper[4740]: I0105 14:07:44.971475 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rm8zj" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.016625 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zgtw8" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.081764 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25773a1d-61fa-499b-a0b5-43dfdaafa421-operator-scripts\") pod \"25773a1d-61fa-499b-a0b5-43dfdaafa421\" (UID: \"25773a1d-61fa-499b-a0b5-43dfdaafa421\") " Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.081979 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbddx\" (UniqueName: \"kubernetes.io/projected/25773a1d-61fa-499b-a0b5-43dfdaafa421-kube-api-access-dbddx\") pod \"25773a1d-61fa-499b-a0b5-43dfdaafa421\" (UID: \"25773a1d-61fa-499b-a0b5-43dfdaafa421\") " Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.084870 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25773a1d-61fa-499b-a0b5-43dfdaafa421-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "25773a1d-61fa-499b-a0b5-43dfdaafa421" (UID: "25773a1d-61fa-499b-a0b5-43dfdaafa421"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.106957 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25773a1d-61fa-499b-a0b5-43dfdaafa421-kube-api-access-dbddx" (OuterVolumeSpecName: "kube-api-access-dbddx") pod "25773a1d-61fa-499b-a0b5-43dfdaafa421" (UID: "25773a1d-61fa-499b-a0b5-43dfdaafa421"). InnerVolumeSpecName "kube-api-access-dbddx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.184833 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25773a1d-61fa-499b-a0b5-43dfdaafa421-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.184873 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbddx\" (UniqueName: \"kubernetes.io/projected/25773a1d-61fa-499b-a0b5-43dfdaafa421-kube-api-access-dbddx\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.261031 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wmlbx" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.273035 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-v4lk2" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.278427 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-fbfrn" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.387184 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/236225ce-ccde-40de-a45f-c19e52ece918-operator-scripts\") pod \"236225ce-ccde-40de-a45f-c19e52ece918\" (UID: \"236225ce-ccde-40de-a45f-c19e52ece918\") " Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.387252 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hlkm\" (UniqueName: \"kubernetes.io/projected/c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889-kube-api-access-7hlkm\") pod \"c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889\" (UID: \"c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889\") " Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.387274 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77686b86-ebf6-47a1-9018-a32f1a089def-operator-scripts\") pod \"77686b86-ebf6-47a1-9018-a32f1a089def\" (UID: \"77686b86-ebf6-47a1-9018-a32f1a089def\") " Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.387327 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnxjs\" (UniqueName: \"kubernetes.io/projected/236225ce-ccde-40de-a45f-c19e52ece918-kube-api-access-nnxjs\") pod \"236225ce-ccde-40de-a45f-c19e52ece918\" (UID: \"236225ce-ccde-40de-a45f-c19e52ece918\") " Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.387357 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt7qq\" (UniqueName: \"kubernetes.io/projected/77686b86-ebf6-47a1-9018-a32f1a089def-kube-api-access-kt7qq\") pod \"77686b86-ebf6-47a1-9018-a32f1a089def\" (UID: \"77686b86-ebf6-47a1-9018-a32f1a089def\") " Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.387538 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889-operator-scripts\") pod \"c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889\" (UID: \"c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889\") " Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.388342 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889" (UID: "c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.388445 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77686b86-ebf6-47a1-9018-a32f1a089def-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "77686b86-ebf6-47a1-9018-a32f1a089def" (UID: "77686b86-ebf6-47a1-9018-a32f1a089def"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.388486 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/236225ce-ccde-40de-a45f-c19e52ece918-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "236225ce-ccde-40de-a45f-c19e52ece918" (UID: "236225ce-ccde-40de-a45f-c19e52ece918"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.390885 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889-kube-api-access-7hlkm" (OuterVolumeSpecName: "kube-api-access-7hlkm") pod "c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889" (UID: "c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889"). InnerVolumeSpecName "kube-api-access-7hlkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.391141 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/236225ce-ccde-40de-a45f-c19e52ece918-kube-api-access-nnxjs" (OuterVolumeSpecName: "kube-api-access-nnxjs") pod "236225ce-ccde-40de-a45f-c19e52ece918" (UID: "236225ce-ccde-40de-a45f-c19e52ece918"). InnerVolumeSpecName "kube-api-access-nnxjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.392105 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77686b86-ebf6-47a1-9018-a32f1a089def-kube-api-access-kt7qq" (OuterVolumeSpecName: "kube-api-access-kt7qq") pod "77686b86-ebf6-47a1-9018-a32f1a089def" (UID: "77686b86-ebf6-47a1-9018-a32f1a089def"). InnerVolumeSpecName "kube-api-access-kt7qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.436380 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rm8zj" event={"ID":"25773a1d-61fa-499b-a0b5-43dfdaafa421","Type":"ContainerDied","Data":"b7f947fa252aba1b6b15b8f34c86bc87e9feb5b81da8a3797e5a00892337661c"} Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.436436 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7f947fa252aba1b6b15b8f34c86bc87e9feb5b81da8a3797e5a00892337661c" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.436508 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rm8zj" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.450543 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hr9pm-config-57ptm"] Jan 05 14:07:45 crc kubenswrapper[4740]: E0105 14:07:45.450984 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236225ce-ccde-40de-a45f-c19e52ece918" containerName="mariadb-database-create" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.451008 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="236225ce-ccde-40de-a45f-c19e52ece918" containerName="mariadb-database-create" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.451021 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-fbfrn" Jan 05 14:07:45 crc kubenswrapper[4740]: E0105 14:07:45.451030 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77686b86-ebf6-47a1-9018-a32f1a089def" containerName="mariadb-database-create" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.451141 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="77686b86-ebf6-47a1-9018-a32f1a089def" containerName="mariadb-database-create" Jan 05 14:07:45 crc kubenswrapper[4740]: E0105 14:07:45.451220 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889" containerName="mariadb-database-create" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.451245 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889" containerName="mariadb-database-create" Jan 05 14:07:45 crc kubenswrapper[4740]: E0105 14:07:45.451281 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25773a1d-61fa-499b-a0b5-43dfdaafa421" containerName="mariadb-database-create" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.451287 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="25773a1d-61fa-499b-a0b5-43dfdaafa421" containerName="mariadb-database-create" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.451694 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889" containerName="mariadb-database-create" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.451726 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="25773a1d-61fa-499b-a0b5-43dfdaafa421" containerName="mariadb-database-create" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.451739 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="77686b86-ebf6-47a1-9018-a32f1a089def" containerName="mariadb-database-create" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.451752 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="236225ce-ccde-40de-a45f-c19e52ece918" containerName="mariadb-database-create" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.452493 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-fbfrn" event={"ID":"c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889","Type":"ContainerDied","Data":"e28b0647dfb078c58b9724e7d4a557d6d5517bd60d586f185368c866fd434589"} Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.452530 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e28b0647dfb078c58b9724e7d4a557d6d5517bd60d586f185368c866fd434589" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.452610 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.466878 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-v4lk2" event={"ID":"236225ce-ccde-40de-a45f-c19e52ece918","Type":"ContainerDied","Data":"227ec86b90b25c6386e8ba1a1b1a4c549f21bb1bfa55a0f896d3cb0dbf4838f2"} Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.466927 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="227ec86b90b25c6386e8ba1a1b1a4c549f21bb1bfa55a0f896d3cb0dbf4838f2" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.467003 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-v4lk2" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.470162 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.472715 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wmlbx" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.473440 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wmlbx" event={"ID":"77686b86-ebf6-47a1-9018-a32f1a089def","Type":"ContainerDied","Data":"483bfc20e54efa1d9a81a5d623591babec384e3e7ac20ddf4ef2a87182a45a32"} Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.473487 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="483bfc20e54efa1d9a81a5d623591babec384e3e7ac20ddf4ef2a87182a45a32" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.499468 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.499495 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/236225ce-ccde-40de-a45f-c19e52ece918-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.499529 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hlkm\" (UniqueName: \"kubernetes.io/projected/c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889-kube-api-access-7hlkm\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.499541 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77686b86-ebf6-47a1-9018-a32f1a089def-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.499551 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnxjs\" (UniqueName: \"kubernetes.io/projected/236225ce-ccde-40de-a45f-c19e52ece918-kube-api-access-nnxjs\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.499560 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt7qq\" (UniqueName: \"kubernetes.io/projected/77686b86-ebf6-47a1-9018-a32f1a089def-kube-api-access-kt7qq\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.499613 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hr9pm-config-57ptm"] Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.601387 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-run\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.601744 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-log-ovn\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " 
pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.601857 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4614eef6-6796-4173-923d-c8d22ec12d21-additional-scripts\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.601994 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98fbx\" (UniqueName: \"kubernetes.io/projected/4614eef6-6796-4173-923d-c8d22ec12d21-kube-api-access-98fbx\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.602618 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4614eef6-6796-4173-923d-c8d22ec12d21-scripts\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.602685 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-run-ovn\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.704908 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-run\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.705008 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-log-ovn\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.705157 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4614eef6-6796-4173-923d-c8d22ec12d21-additional-scripts\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.705205 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98fbx\" (UniqueName: \"kubernetes.io/projected/4614eef6-6796-4173-923d-c8d22ec12d21-kube-api-access-98fbx\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.705283 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/4614eef6-6796-4173-923d-c8d22ec12d21-scripts\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.705300 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-log-ovn\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.705307 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-run\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.705329 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-run-ovn\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.705378 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-run-ovn\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.706630 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4614eef6-6796-4173-923d-c8d22ec12d21-additional-scripts\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.707611 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4614eef6-6796-4173-923d-c8d22ec12d21-scripts\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.727814 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98fbx\" (UniqueName: \"kubernetes.io/projected/4614eef6-6796-4173-923d-c8d22ec12d21-kube-api-access-98fbx\") pod \"ovn-controller-hr9pm-config-57ptm\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:45 crc kubenswrapper[4740]: I0105 14:07:45.803559 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.048081 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-506a-account-create-update-w45h2" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.119139 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jgsg\" (UniqueName: \"kubernetes.io/projected/63b41b33-b5bf-418b-b5c5-595e952f1380-kube-api-access-7jgsg\") pod \"63b41b33-b5bf-418b-b5c5-595e952f1380\" (UID: \"63b41b33-b5bf-418b-b5c5-595e952f1380\") " Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.119188 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63b41b33-b5bf-418b-b5c5-595e952f1380-operator-scripts\") pod \"63b41b33-b5bf-418b-b5c5-595e952f1380\" (UID: \"63b41b33-b5bf-418b-b5c5-595e952f1380\") " Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.121354 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63b41b33-b5bf-418b-b5c5-595e952f1380-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "63b41b33-b5bf-418b-b5c5-595e952f1380" (UID: "63b41b33-b5bf-418b-b5c5-595e952f1380"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.130567 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63b41b33-b5bf-418b-b5c5-595e952f1380-kube-api-access-7jgsg" (OuterVolumeSpecName: "kube-api-access-7jgsg") pod "63b41b33-b5bf-418b-b5c5-595e952f1380" (UID: "63b41b33-b5bf-418b-b5c5-595e952f1380"). InnerVolumeSpecName "kube-api-access-7jgsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.188346 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-98ff-account-create-update-48ls5" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.195669 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5361-account-create-update-m2b2w" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.211991 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8281-account-create-update-fmlwx" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.223105 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jgsg\" (UniqueName: \"kubernetes.io/projected/63b41b33-b5bf-418b-b5c5-595e952f1380-kube-api-access-7jgsg\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.223129 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63b41b33-b5bf-418b-b5c5-595e952f1380-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.324613 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/220854d5-c397-408e-b4f1-4f5c7a9ab8b2-operator-scripts\") pod \"220854d5-c397-408e-b4f1-4f5c7a9ab8b2\" (UID: \"220854d5-c397-408e-b4f1-4f5c7a9ab8b2\") " Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.324709 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqc92\" (UniqueName: \"kubernetes.io/projected/e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00-kube-api-access-vqc92\") pod \"e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00\" (UID: \"e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00\") " Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.324730 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2589e83-7430-45b3-95ad-a91a164948a6-operator-scripts\") pod \"c2589e83-7430-45b3-95ad-a91a164948a6\" (UID: \"c2589e83-7430-45b3-95ad-a91a164948a6\") " Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.324763 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sm4t\" (UniqueName: \"kubernetes.io/projected/220854d5-c397-408e-b4f1-4f5c7a9ab8b2-kube-api-access-9sm4t\") pod \"220854d5-c397-408e-b4f1-4f5c7a9ab8b2\" (UID: \"220854d5-c397-408e-b4f1-4f5c7a9ab8b2\") " Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.324864 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00-operator-scripts\") pod \"e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00\" (UID: \"e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00\") " Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.324933 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpm92\" (UniqueName: \"kubernetes.io/projected/c2589e83-7430-45b3-95ad-a91a164948a6-kube-api-access-dpm92\") pod \"c2589e83-7430-45b3-95ad-a91a164948a6\" (UID: \"c2589e83-7430-45b3-95ad-a91a164948a6\") " Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.325121 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2589e83-7430-45b3-95ad-a91a164948a6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2589e83-7430-45b3-95ad-a91a164948a6" (UID: "c2589e83-7430-45b3-95ad-a91a164948a6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.325168 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/220854d5-c397-408e-b4f1-4f5c7a9ab8b2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "220854d5-c397-408e-b4f1-4f5c7a9ab8b2" (UID: "220854d5-c397-408e-b4f1-4f5c7a9ab8b2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.325487 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/220854d5-c397-408e-b4f1-4f5c7a9ab8b2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.325505 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2589e83-7430-45b3-95ad-a91a164948a6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.325504 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00" (UID: "e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.328491 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2589e83-7430-45b3-95ad-a91a164948a6-kube-api-access-dpm92" (OuterVolumeSpecName: "kube-api-access-dpm92") pod "c2589e83-7430-45b3-95ad-a91a164948a6" (UID: "c2589e83-7430-45b3-95ad-a91a164948a6"). InnerVolumeSpecName "kube-api-access-dpm92". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.328633 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/220854d5-c397-408e-b4f1-4f5c7a9ab8b2-kube-api-access-9sm4t" (OuterVolumeSpecName: "kube-api-access-9sm4t") pod "220854d5-c397-408e-b4f1-4f5c7a9ab8b2" (UID: "220854d5-c397-408e-b4f1-4f5c7a9ab8b2"). InnerVolumeSpecName "kube-api-access-9sm4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.328710 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00-kube-api-access-vqc92" (OuterVolumeSpecName: "kube-api-access-vqc92") pod "e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00" (UID: "e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00"). InnerVolumeSpecName "kube-api-access-vqc92". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.418159 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hr9pm-config-57ptm"] Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.431583 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqc92\" (UniqueName: \"kubernetes.io/projected/e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00-kube-api-access-vqc92\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.431617 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sm4t\" (UniqueName: \"kubernetes.io/projected/220854d5-c397-408e-b4f1-4f5c7a9ab8b2-kube-api-access-9sm4t\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.431628 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.431639 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpm92\" (UniqueName: \"kubernetes.io/projected/c2589e83-7430-45b3-95ad-a91a164948a6-kube-api-access-dpm92\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.468587 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9czq7"] Jan 05 14:07:46 crc kubenswrapper[4740]: E0105 14:07:46.469236 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63b41b33-b5bf-418b-b5c5-595e952f1380" containerName="mariadb-account-create-update" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.469255 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="63b41b33-b5bf-418b-b5c5-595e952f1380" containerName="mariadb-account-create-update" Jan 05 14:07:46 crc kubenswrapper[4740]: E0105 14:07:46.469286 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="220854d5-c397-408e-b4f1-4f5c7a9ab8b2" containerName="mariadb-account-create-update" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.469295 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="220854d5-c397-408e-b4f1-4f5c7a9ab8b2" containerName="mariadb-account-create-update" Jan 05 14:07:46 crc kubenswrapper[4740]: E0105 14:07:46.469307 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00" containerName="mariadb-account-create-update" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.469317 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00" containerName="mariadb-account-create-update" Jan 05 14:07:46 crc kubenswrapper[4740]: E0105 14:07:46.469344 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2589e83-7430-45b3-95ad-a91a164948a6" containerName="mariadb-account-create-update" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.469352 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2589e83-7430-45b3-95ad-a91a164948a6" containerName="mariadb-account-create-update" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.469617 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00" containerName="mariadb-account-create-update" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.469636 4740 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c2589e83-7430-45b3-95ad-a91a164948a6" containerName="mariadb-account-create-update" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.469647 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="220854d5-c397-408e-b4f1-4f5c7a9ab8b2" containerName="mariadb-account-create-update" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.469669 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="63b41b33-b5bf-418b-b5c5-595e952f1380" containerName="mariadb-account-create-update" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.470652 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9czq7" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.477394 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9czq7"] Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.483555 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.495288 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-98ff-account-create-update-48ls5" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.496278 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-98ff-account-create-update-48ls5" event={"ID":"220854d5-c397-408e-b4f1-4f5c7a9ab8b2","Type":"ContainerDied","Data":"5e0e1e54625058361284319db97bfabcd1b4efb80c870e8de09eb9178e040f2b"} Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.496323 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e0e1e54625058361284319db97bfabcd1b4efb80c870e8de09eb9178e040f2b" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.498522 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5361-account-create-update-m2b2w" event={"ID":"c2589e83-7430-45b3-95ad-a91a164948a6","Type":"ContainerDied","Data":"26e1f3db533d8bd49ba932d3c370b45593ba196d732f65ec21af9db3c592c36f"} Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.498594 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26e1f3db533d8bd49ba932d3c370b45593ba196d732f65ec21af9db3c592c36f" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.498665 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5361-account-create-update-m2b2w" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.513597 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8281-account-create-update-fmlwx" event={"ID":"e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00","Type":"ContainerDied","Data":"02378e5c8beebda32e97c75a5607dd9ebd52f377a20d70dcadc8ea64b35cf5a6"} Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.513635 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02378e5c8beebda32e97c75a5607dd9ebd52f377a20d70dcadc8ea64b35cf5a6" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.513715 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8281-account-create-update-fmlwx" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.519572 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-506a-account-create-update-w45h2" event={"ID":"63b41b33-b5bf-418b-b5c5-595e952f1380","Type":"ContainerDied","Data":"84d3360168215b2a1ee86021954cf4aed302aaedd77586d7a8eff4a2525a3f1b"} Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.519624 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84d3360168215b2a1ee86021954cf4aed302aaedd77586d7a8eff4a2525a3f1b" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.519678 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-506a-account-create-update-w45h2" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.526021 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hr9pm-config-57ptm" event={"ID":"4614eef6-6796-4173-923d-c8d22ec12d21","Type":"ContainerStarted","Data":"924d8aeb385b0f6270bd7e623d6c115c4c46d3ac7b66e4e292b632ce5618d487"} Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.533516 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l5r6\" (UniqueName: \"kubernetes.io/projected/826861b3-ea51-459e-8cf1-6a836e8433b0-kube-api-access-4l5r6\") pod \"root-account-create-update-9czq7\" (UID: \"826861b3-ea51-459e-8cf1-6a836e8433b0\") " pod="openstack/root-account-create-update-9czq7" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.533738 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/826861b3-ea51-459e-8cf1-6a836e8433b0-operator-scripts\") pod \"root-account-create-update-9czq7\" (UID: \"826861b3-ea51-459e-8cf1-6a836e8433b0\") " pod="openstack/root-account-create-update-9czq7" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.636496 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l5r6\" (UniqueName: \"kubernetes.io/projected/826861b3-ea51-459e-8cf1-6a836e8433b0-kube-api-access-4l5r6\") pod \"root-account-create-update-9czq7\" (UID: \"826861b3-ea51-459e-8cf1-6a836e8433b0\") " pod="openstack/root-account-create-update-9czq7" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.636666 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/826861b3-ea51-459e-8cf1-6a836e8433b0-operator-scripts\") pod \"root-account-create-update-9czq7\" (UID: \"826861b3-ea51-459e-8cf1-6a836e8433b0\") " pod="openstack/root-account-create-update-9czq7" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.637498 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/826861b3-ea51-459e-8cf1-6a836e8433b0-operator-scripts\") pod \"root-account-create-update-9czq7\" (UID: \"826861b3-ea51-459e-8cf1-6a836e8433b0\") " pod="openstack/root-account-create-update-9czq7" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.654702 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l5r6\" (UniqueName: \"kubernetes.io/projected/826861b3-ea51-459e-8cf1-6a836e8433b0-kube-api-access-4l5r6\") pod \"root-account-create-update-9czq7\" (UID: 
\"826861b3-ea51-459e-8cf1-6a836e8433b0\") " pod="openstack/root-account-create-update-9czq7" Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.696879 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.697308 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerName="thanos-sidecar" containerID="cri-o://c87eb6635019a99f8f7e55591534536b05ba8a0c168b1758e8cf26729197c061" gracePeriod=600 Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.697356 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerName="config-reloader" containerID="cri-o://1e18d3a01fd61abd767af8db666a3962b57b0a7bd221181f75df6fdce6d0a5c0" gracePeriod=600 Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.697270 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerName="prometheus" containerID="cri-o://4a75b8e7c93c609a1897185d6697b21e1f68678db7e22a100668ce9bbd67f312" gracePeriod=600 Jan 05 14:07:46 crc kubenswrapper[4740]: I0105 14:07:46.797865 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9czq7" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.291741 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9czq7"] Jan 05 14:07:47 crc kubenswrapper[4740]: W0105 14:07:47.293358 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod826861b3_ea51_459e_8cf1_6a836e8433b0.slice/crio-c7638b4f2fb1cf132060cf0b7c897ffc6ee0d2b65890d8c6e7ad2f7e68724a3e WatchSource:0}: Error finding container c7638b4f2fb1cf132060cf0b7c897ffc6ee0d2b65890d8c6e7ad2f7e68724a3e: Status 404 returned error can't find the container with id c7638b4f2fb1cf132060cf0b7c897ffc6ee0d2b65890d8c6e7ad2f7e68724a3e Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.543035 4740 generic.go:334] "Generic (PLEG): container finished" podID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerID="c87eb6635019a99f8f7e55591534536b05ba8a0c168b1758e8cf26729197c061" exitCode=0 Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.543401 4740 generic.go:334] "Generic (PLEG): container finished" podID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerID="1e18d3a01fd61abd767af8db666a3962b57b0a7bd221181f75df6fdce6d0a5c0" exitCode=0 Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.543410 4740 generic.go:334] "Generic (PLEG): container finished" podID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerID="4a75b8e7c93c609a1897185d6697b21e1f68678db7e22a100668ce9bbd67f312" exitCode=0 Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.543453 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cfa6a7e6-018b-43df-8084-48e1aa10e2ca","Type":"ContainerDied","Data":"c87eb6635019a99f8f7e55591534536b05ba8a0c168b1758e8cf26729197c061"} Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.543483 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"cfa6a7e6-018b-43df-8084-48e1aa10e2ca","Type":"ContainerDied","Data":"1e18d3a01fd61abd767af8db666a3962b57b0a7bd221181f75df6fdce6d0a5c0"} Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.543497 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cfa6a7e6-018b-43df-8084-48e1aa10e2ca","Type":"ContainerDied","Data":"4a75b8e7c93c609a1897185d6697b21e1f68678db7e22a100668ce9bbd67f312"} Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.556662 4740 generic.go:334] "Generic (PLEG): container finished" podID="4614eef6-6796-4173-923d-c8d22ec12d21" containerID="1f37daa9eb9780e6ce82b90be0b0b660cb628f160ae46c97b52ca28948c80008" exitCode=0 Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.556757 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hr9pm-config-57ptm" event={"ID":"4614eef6-6796-4173-923d-c8d22ec12d21","Type":"ContainerDied","Data":"1f37daa9eb9780e6ce82b90be0b0b660cb628f160ae46c97b52ca28948c80008"} Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.561585 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9czq7" event={"ID":"826861b3-ea51-459e-8cf1-6a836e8433b0","Type":"ContainerStarted","Data":"94c7df84480fd76a5119a275577c87912d242bc8d1522e50a9dbfcc6aa200f59"} Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.561627 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9czq7" event={"ID":"826861b3-ea51-459e-8cf1-6a836e8433b0","Type":"ContainerStarted","Data":"c7638b4f2fb1cf132060cf0b7c897ffc6ee0d2b65890d8c6e7ad2f7e68724a3e"} Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.596501 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-9czq7" podStartSLOduration=1.596482241 podStartE2EDuration="1.596482241s" podCreationTimestamp="2026-01-05 14:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:07:47.591148779 +0000 UTC m=+1116.898057358" watchObservedRunningTime="2026-01-05 14:07:47.596482241 +0000 UTC m=+1116.903390830" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.695608 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.762272 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-config\") pod \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.762667 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-2\") pod \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.762702 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-thanos-prometheus-http-client-file\") pod \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.762898 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\") pod \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.762968 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-web-config\") pod \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.763029 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-config-out\") pod \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.763052 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-tls-assets\") pod \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.763082 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjv2r\" (UniqueName: \"kubernetes.io/projected/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-kube-api-access-hjv2r\") pod \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.763177 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-0\") pod \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.763239 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-1\") pod \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\" (UID: \"cfa6a7e6-018b-43df-8084-48e1aa10e2ca\") " Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.765844 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "cfa6a7e6-018b-43df-8084-48e1aa10e2ca" (UID: "cfa6a7e6-018b-43df-8084-48e1aa10e2ca"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.766198 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "cfa6a7e6-018b-43df-8084-48e1aa10e2ca" (UID: "cfa6a7e6-018b-43df-8084-48e1aa10e2ca"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.767706 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "cfa6a7e6-018b-43df-8084-48e1aa10e2ca" (UID: "cfa6a7e6-018b-43df-8084-48e1aa10e2ca"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.771209 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "cfa6a7e6-018b-43df-8084-48e1aa10e2ca" (UID: "cfa6a7e6-018b-43df-8084-48e1aa10e2ca"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.775600 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-config" (OuterVolumeSpecName: "config") pod "cfa6a7e6-018b-43df-8084-48e1aa10e2ca" (UID: "cfa6a7e6-018b-43df-8084-48e1aa10e2ca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.778510 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-kube-api-access-hjv2r" (OuterVolumeSpecName: "kube-api-access-hjv2r") pod "cfa6a7e6-018b-43df-8084-48e1aa10e2ca" (UID: "cfa6a7e6-018b-43df-8084-48e1aa10e2ca"). InnerVolumeSpecName "kube-api-access-hjv2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.779512 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "cfa6a7e6-018b-43df-8084-48e1aa10e2ca" (UID: "cfa6a7e6-018b-43df-8084-48e1aa10e2ca"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.785198 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-config-out" (OuterVolumeSpecName: "config-out") pod "cfa6a7e6-018b-43df-8084-48e1aa10e2ca" (UID: "cfa6a7e6-018b-43df-8084-48e1aa10e2ca"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.793280 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-web-config" (OuterVolumeSpecName: "web-config") pod "cfa6a7e6-018b-43df-8084-48e1aa10e2ca" (UID: "cfa6a7e6-018b-43df-8084-48e1aa10e2ca"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.801731 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "cfa6a7e6-018b-43df-8084-48e1aa10e2ca" (UID: "cfa6a7e6-018b-43df-8084-48e1aa10e2ca"). InnerVolumeSpecName "pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.865458 4740 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.865493 4740 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.865503 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.865514 4740 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.865527 4740 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.865556 4740 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\") on node \"crc\" " Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.865567 4740 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-web-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.865576 4740 
reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-config-out\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.865584 4740 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.865592 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjv2r\" (UniqueName: \"kubernetes.io/projected/cfa6a7e6-018b-43df-8084-48e1aa10e2ca-kube-api-access-hjv2r\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.892120 4740 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.892261 4740 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f") on node "crc" Jan 05 14:07:47 crc kubenswrapper[4740]: I0105 14:07:47.967918 4740 reconciler_common.go:293] "Volume detached for volume \"pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.576909 4740 generic.go:334] "Generic (PLEG): container finished" podID="826861b3-ea51-459e-8cf1-6a836e8433b0" containerID="94c7df84480fd76a5119a275577c87912d242bc8d1522e50a9dbfcc6aa200f59" exitCode=0 Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.577031 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9czq7" event={"ID":"826861b3-ea51-459e-8cf1-6a836e8433b0","Type":"ContainerDied","Data":"94c7df84480fd76a5119a275577c87912d242bc8d1522e50a9dbfcc6aa200f59"} Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.581526 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cfa6a7e6-018b-43df-8084-48e1aa10e2ca","Type":"ContainerDied","Data":"f9684b14067f4469a6f9b57c3e3dc1bc93cc9225d3789de988c42ab411787cf7"} Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.581630 4740 scope.go:117] "RemoveContainer" containerID="c87eb6635019a99f8f7e55591534536b05ba8a0c168b1758e8cf26729197c061" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.581699 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.618890 4740 scope.go:117] "RemoveContainer" containerID="1e18d3a01fd61abd767af8db666a3962b57b0a7bd221181f75df6fdce6d0a5c0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.656996 4740 scope.go:117] "RemoveContainer" containerID="4a75b8e7c93c609a1897185d6697b21e1f68678db7e22a100668ce9bbd67f312" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.661411 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.673482 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.702782 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 05 14:07:48 crc kubenswrapper[4740]: E0105 14:07:48.703435 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerName="config-reloader" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.703454 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerName="config-reloader" Jan 05 14:07:48 crc kubenswrapper[4740]: E0105 14:07:48.703476 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerName="prometheus" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.703490 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerName="prometheus" Jan 05 14:07:48 crc kubenswrapper[4740]: E0105 14:07:48.703511 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerName="init-config-reloader" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.703522 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerName="init-config-reloader" Jan 05 14:07:48 crc kubenswrapper[4740]: E0105 14:07:48.703541 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerName="thanos-sidecar" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.703550 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerName="thanos-sidecar" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.703869 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerName="prometheus" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.703892 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerName="thanos-sidecar" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.703916 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" containerName="config-reloader" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.706359 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.729631 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-b78ns" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.730128 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.730317 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.730477 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.730698 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.730894 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.731076 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.731125 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.731245 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.739665 4740 scope.go:117] "RemoveContainer" containerID="0b97cb304c7443a8f612ae0939ddd0a643468732302ad8bbdc9539057e6d4f8a" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.756379 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.819884 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a87008bf-295c-4343-a6b2-f3fd37fa581d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.819971 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.820018 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.820085 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.820118 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.820224 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.820259 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-config\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.820307 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a87008bf-295c-4343-a6b2-f3fd37fa581d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.820342 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a87008bf-295c-4343-a6b2-f3fd37fa581d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.820395 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a87008bf-295c-4343-a6b2-f3fd37fa581d-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.820745 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7bgk\" (UniqueName: \"kubernetes.io/projected/a87008bf-295c-4343-a6b2-f3fd37fa581d-kube-api-access-s7bgk\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.820805 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.820884 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a87008bf-295c-4343-a6b2-f3fd37fa581d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.923101 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a87008bf-295c-4343-a6b2-f3fd37fa581d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.923150 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a87008bf-295c-4343-a6b2-f3fd37fa581d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.923196 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a87008bf-295c-4343-a6b2-f3fd37fa581d-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.923226 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7bgk\" (UniqueName: \"kubernetes.io/projected/a87008bf-295c-4343-a6b2-f3fd37fa581d-kube-api-access-s7bgk\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.923281 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.923332 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a87008bf-295c-4343-a6b2-f3fd37fa581d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.923365 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a87008bf-295c-4343-a6b2-f3fd37fa581d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.923391 
4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.923416 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.923445 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.923472 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.923494 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.923513 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-config\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.924942 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a87008bf-295c-4343-a6b2-f3fd37fa581d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.924990 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a87008bf-295c-4343-a6b2-f3fd37fa581d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.926291 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a87008bf-295c-4343-a6b2-f3fd37fa581d-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.937495 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.937759 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4467b90e9c5a99a8934af67251ce6e91a7d02d2690b27a0634fda785dce76400/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.938790 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.939505 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.939633 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a87008bf-295c-4343-a6b2-f3fd37fa581d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.939806 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.939745 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.942005 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a87008bf-295c-4343-a6b2-f3fd37fa581d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.942262 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.944851 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7bgk\" (UniqueName: \"kubernetes.io/projected/a87008bf-295c-4343-a6b2-f3fd37fa581d-kube-api-access-s7bgk\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.955466 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a87008bf-295c-4343-a6b2-f3fd37fa581d-config\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.988995 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfa6a7e6-018b-43df-8084-48e1aa10e2ca" path="/var/lib/kubelet/pods/cfa6a7e6-018b-43df-8084-48e1aa10e2ca/volumes" Jan 05 14:07:48 crc kubenswrapper[4740]: I0105 14:07:48.996014 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29147995-b4f5-4772-a933-ff9a3f5fde8f\") pod \"prometheus-metric-storage-0\" (UID: \"a87008bf-295c-4343-a6b2-f3fd37fa581d\") " pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.051696 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.245696 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.332137 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-log-ovn\") pod \"4614eef6-6796-4173-923d-c8d22ec12d21\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.332201 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-run\") pod \"4614eef6-6796-4173-923d-c8d22ec12d21\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.332225 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-run-ovn\") pod \"4614eef6-6796-4173-923d-c8d22ec12d21\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.332322 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "4614eef6-6796-4173-923d-c8d22ec12d21" (UID: "4614eef6-6796-4173-923d-c8d22ec12d21"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.332356 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4614eef6-6796-4173-923d-c8d22ec12d21-additional-scripts\") pod \"4614eef6-6796-4173-923d-c8d22ec12d21\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.332366 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-run" (OuterVolumeSpecName: "var-run") pod "4614eef6-6796-4173-923d-c8d22ec12d21" (UID: "4614eef6-6796-4173-923d-c8d22ec12d21"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.332389 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "4614eef6-6796-4173-923d-c8d22ec12d21" (UID: "4614eef6-6796-4173-923d-c8d22ec12d21"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.332391 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4614eef6-6796-4173-923d-c8d22ec12d21-scripts\") pod \"4614eef6-6796-4173-923d-c8d22ec12d21\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.332513 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98fbx\" (UniqueName: \"kubernetes.io/projected/4614eef6-6796-4173-923d-c8d22ec12d21-kube-api-access-98fbx\") pod \"4614eef6-6796-4173-923d-c8d22ec12d21\" (UID: \"4614eef6-6796-4173-923d-c8d22ec12d21\") " Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.333086 4740 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.333107 4740 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-run\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.333119 4740 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4614eef6-6796-4173-923d-c8d22ec12d21-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.333351 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4614eef6-6796-4173-923d-c8d22ec12d21-scripts" (OuterVolumeSpecName: "scripts") pod "4614eef6-6796-4173-923d-c8d22ec12d21" (UID: "4614eef6-6796-4173-923d-c8d22ec12d21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.334087 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4614eef6-6796-4173-923d-c8d22ec12d21-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "4614eef6-6796-4173-923d-c8d22ec12d21" (UID: "4614eef6-6796-4173-923d-c8d22ec12d21"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.337746 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4614eef6-6796-4173-923d-c8d22ec12d21-kube-api-access-98fbx" (OuterVolumeSpecName: "kube-api-access-98fbx") pod "4614eef6-6796-4173-923d-c8d22ec12d21" (UID: "4614eef6-6796-4173-923d-c8d22ec12d21"). InnerVolumeSpecName "kube-api-access-98fbx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.435464 4740 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4614eef6-6796-4173-923d-c8d22ec12d21-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.435502 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4614eef6-6796-4173-923d-c8d22ec12d21-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.435516 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98fbx\" (UniqueName: \"kubernetes.io/projected/4614eef6-6796-4173-923d-c8d22ec12d21-kube-api-access-98fbx\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.590843 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 05 14:07:49 crc kubenswrapper[4740]: W0105 14:07:49.593311 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda87008bf_295c_4343_a6b2_f3fd37fa581d.slice/crio-0357aaeb399e428daf65aea5ad939845507e2b3f6b92697a3e56bd49b5a1a40a WatchSource:0}: Error finding container 0357aaeb399e428daf65aea5ad939845507e2b3f6b92697a3e56bd49b5a1a40a: Status 404 returned error can't find the container with id 0357aaeb399e428daf65aea5ad939845507e2b3f6b92697a3e56bd49b5a1a40a Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.597666 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hr9pm-config-57ptm" Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.597679 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hr9pm-config-57ptm" event={"ID":"4614eef6-6796-4173-923d-c8d22ec12d21","Type":"ContainerDied","Data":"924d8aeb385b0f6270bd7e623d6c115c4c46d3ac7b66e4e292b632ce5618d487"} Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.597726 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="924d8aeb385b0f6270bd7e623d6c115c4c46d3ac7b66e4e292b632ce5618d487" Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.611176 4740 generic.go:334] "Generic (PLEG): container finished" podID="54c80dba-b90f-4288-a366-4ff77f76db22" containerID="1c047226985f396d19c238260e8d4fcd5bd4861cf84ef028ed887a1b0fa068b9" exitCode=0 Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.611448 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"54c80dba-b90f-4288-a366-4ff77f76db22","Type":"ContainerDied","Data":"1c047226985f396d19c238260e8d4fcd5bd4861cf84ef028ed887a1b0fa068b9"} Jan 05 14:07:49 crc kubenswrapper[4740]: I0105 14:07:49.846002 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-hr9pm" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.047915 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9czq7" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.157990 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l5r6\" (UniqueName: \"kubernetes.io/projected/826861b3-ea51-459e-8cf1-6a836e8433b0-kube-api-access-4l5r6\") pod \"826861b3-ea51-459e-8cf1-6a836e8433b0\" (UID: \"826861b3-ea51-459e-8cf1-6a836e8433b0\") " Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.158260 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/826861b3-ea51-459e-8cf1-6a836e8433b0-operator-scripts\") pod \"826861b3-ea51-459e-8cf1-6a836e8433b0\" (UID: \"826861b3-ea51-459e-8cf1-6a836e8433b0\") " Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.158679 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/826861b3-ea51-459e-8cf1-6a836e8433b0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "826861b3-ea51-459e-8cf1-6a836e8433b0" (UID: "826861b3-ea51-459e-8cf1-6a836e8433b0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.159098 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/826861b3-ea51-459e-8cf1-6a836e8433b0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.173617 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/826861b3-ea51-459e-8cf1-6a836e8433b0-kube-api-access-4l5r6" (OuterVolumeSpecName: "kube-api-access-4l5r6") pod "826861b3-ea51-459e-8cf1-6a836e8433b0" (UID: "826861b3-ea51-459e-8cf1-6a836e8433b0"). InnerVolumeSpecName "kube-api-access-4l5r6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.260793 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l5r6\" (UniqueName: \"kubernetes.io/projected/826861b3-ea51-459e-8cf1-6a836e8433b0-kube-api-access-4l5r6\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.309116 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-qmxnw"] Jan 05 14:07:50 crc kubenswrapper[4740]: E0105 14:07:50.309666 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4614eef6-6796-4173-923d-c8d22ec12d21" containerName="ovn-config" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.309693 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4614eef6-6796-4173-923d-c8d22ec12d21" containerName="ovn-config" Jan 05 14:07:50 crc kubenswrapper[4740]: E0105 14:07:50.309713 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="826861b3-ea51-459e-8cf1-6a836e8433b0" containerName="mariadb-account-create-update" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.309723 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="826861b3-ea51-459e-8cf1-6a836e8433b0" containerName="mariadb-account-create-update" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.309969 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="826861b3-ea51-459e-8cf1-6a836e8433b0" containerName="mariadb-account-create-update" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.309996 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="4614eef6-6796-4173-923d-c8d22ec12d21" containerName="ovn-config" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.310912 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qmxnw" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.312859 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kw6zt" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.314349 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.322622 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qmxnw"] Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.399482 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hr9pm-config-57ptm"] Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.411493 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hr9pm-config-57ptm"] Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.461243 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hr9pm-config-qzl27"] Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.462866 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.468104 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.468531 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-combined-ca-bundle\") pod \"glance-db-sync-qmxnw\" (UID: \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\") " pod="openstack/glance-db-sync-qmxnw" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.468614 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-config-data\") pod \"glance-db-sync-qmxnw\" (UID: \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\") " pod="openstack/glance-db-sync-qmxnw" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.468686 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-db-sync-config-data\") pod \"glance-db-sync-qmxnw\" (UID: \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\") " pod="openstack/glance-db-sync-qmxnw" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.468767 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4bgj\" (UniqueName: \"kubernetes.io/projected/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-kube-api-access-t4bgj\") pod \"glance-db-sync-qmxnw\" (UID: \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\") " pod="openstack/glance-db-sync-qmxnw" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.480902 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hr9pm-config-qzl27"] Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.570400 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-config-data\") pod \"glance-db-sync-qmxnw\" (UID: \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\") " pod="openstack/glance-db-sync-qmxnw" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.570474 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3ed98bf8-2a88-482e-9de1-8fa7972508d7-additional-scripts\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.570536 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-log-ovn\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.570586 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ed98bf8-2a88-482e-9de1-8fa7972508d7-scripts\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: 
\"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.570613 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9v7t\" (UniqueName: \"kubernetes.io/projected/3ed98bf8-2a88-482e-9de1-8fa7972508d7-kube-api-access-m9v7t\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.570662 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-db-sync-config-data\") pod \"glance-db-sync-qmxnw\" (UID: \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\") " pod="openstack/glance-db-sync-qmxnw" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.570761 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-run\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.570801 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-run-ovn\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.570891 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4bgj\" (UniqueName: \"kubernetes.io/projected/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-kube-api-access-t4bgj\") pod \"glance-db-sync-qmxnw\" (UID: \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\") " pod="openstack/glance-db-sync-qmxnw" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.570929 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-combined-ca-bundle\") pod \"glance-db-sync-qmxnw\" (UID: \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\") " pod="openstack/glance-db-sync-qmxnw" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.575221 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-db-sync-config-data\") pod \"glance-db-sync-qmxnw\" (UID: \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\") " pod="openstack/glance-db-sync-qmxnw" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.575269 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-config-data\") pod \"glance-db-sync-qmxnw\" (UID: \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\") " pod="openstack/glance-db-sync-qmxnw" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.577684 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-combined-ca-bundle\") pod \"glance-db-sync-qmxnw\" (UID: \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\") " 
pod="openstack/glance-db-sync-qmxnw" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.590311 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4bgj\" (UniqueName: \"kubernetes.io/projected/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-kube-api-access-t4bgj\") pod \"glance-db-sync-qmxnw\" (UID: \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\") " pod="openstack/glance-db-sync-qmxnw" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.622805 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a87008bf-295c-4343-a6b2-f3fd37fa581d","Type":"ContainerStarted","Data":"0357aaeb399e428daf65aea5ad939845507e2b3f6b92697a3e56bd49b5a1a40a"} Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.626384 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"54c80dba-b90f-4288-a366-4ff77f76db22","Type":"ContainerStarted","Data":"0ac551bcb4a2ac3c200ba4459a08bc635b2416c696711644844f2e4d08c36508"} Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.626684 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.629575 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qmxnw" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.631039 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9czq7" event={"ID":"826861b3-ea51-459e-8cf1-6a836e8433b0","Type":"ContainerDied","Data":"c7638b4f2fb1cf132060cf0b7c897ffc6ee0d2b65890d8c6e7ad2f7e68724a3e"} Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.631101 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7638b4f2fb1cf132060cf0b7c897ffc6ee0d2b65890d8c6e7ad2f7e68724a3e" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.631112 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9czq7" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.669286 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.070313159 podStartE2EDuration="1m15.669261404s" podCreationTimestamp="2026-01-05 14:06:35 +0000 UTC" firstStartedPulling="2026-01-05 14:06:37.83549243 +0000 UTC m=+1047.142401009" lastFinishedPulling="2026-01-05 14:07:15.434440675 +0000 UTC m=+1084.741349254" observedRunningTime="2026-01-05 14:07:50.652534916 +0000 UTC m=+1119.959443515" watchObservedRunningTime="2026-01-05 14:07:50.669261404 +0000 UTC m=+1119.976169983" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.672796 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3ed98bf8-2a88-482e-9de1-8fa7972508d7-additional-scripts\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.672892 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-log-ovn\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.672926 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ed98bf8-2a88-482e-9de1-8fa7972508d7-scripts\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.672952 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9v7t\" (UniqueName: \"kubernetes.io/projected/3ed98bf8-2a88-482e-9de1-8fa7972508d7-kube-api-access-m9v7t\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.673018 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-run\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.673042 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-run-ovn\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.673232 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-log-ovn\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.673293 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-run-ovn\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.673487 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-run\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.673636 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3ed98bf8-2a88-482e-9de1-8fa7972508d7-additional-scripts\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.674782 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ed98bf8-2a88-482e-9de1-8fa7972508d7-scripts\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.691090 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9v7t\" (UniqueName: \"kubernetes.io/projected/3ed98bf8-2a88-482e-9de1-8fa7972508d7-kube-api-access-m9v7t\") pod \"ovn-controller-hr9pm-config-qzl27\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.782707 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:50 crc kubenswrapper[4740]: I0105 14:07:50.993941 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4614eef6-6796-4173-923d-c8d22ec12d21" path="/var/lib/kubelet/pods/4614eef6-6796-4173-923d-c8d22ec12d21/volumes" Jan 05 14:07:51 crc kubenswrapper[4740]: I0105 14:07:51.329368 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hr9pm-config-qzl27"] Jan 05 14:07:51 crc kubenswrapper[4740]: W0105 14:07:51.333283 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ed98bf8_2a88_482e_9de1_8fa7972508d7.slice/crio-2e3434b5632d3e8895046a235361fec1eca31e516852d1624922a2dd3ef23e60 WatchSource:0}: Error finding container 2e3434b5632d3e8895046a235361fec1eca31e516852d1624922a2dd3ef23e60: Status 404 returned error can't find the container with id 2e3434b5632d3e8895046a235361fec1eca31e516852d1624922a2dd3ef23e60 Jan 05 14:07:51 crc kubenswrapper[4740]: I0105 14:07:51.342664 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qmxnw"] Jan 05 14:07:51 crc kubenswrapper[4740]: I0105 14:07:51.639757 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qmxnw" event={"ID":"de1ed2b0-7aa0-43ad-a948-07e7b8de711f","Type":"ContainerStarted","Data":"af739fb54670fc8456ff40f347b31e5386f096758205d0c134c49b1db3dfa19d"} Jan 05 14:07:51 crc kubenswrapper[4740]: I0105 14:07:51.641460 4740 generic.go:334] "Generic (PLEG): container finished" podID="4396968c-d77b-434d-888f-3ab578514bbe" containerID="993c05e80bcb4da74365c39514c46be69f849324868d0b66d705303f726b6ac1" exitCode=0 Jan 05 14:07:51 crc kubenswrapper[4740]: I0105 14:07:51.641519 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"4396968c-d77b-434d-888f-3ab578514bbe","Type":"ContainerDied","Data":"993c05e80bcb4da74365c39514c46be69f849324868d0b66d705303f726b6ac1"} Jan 05 14:07:51 crc kubenswrapper[4740]: I0105 14:07:51.642889 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hr9pm-config-qzl27" event={"ID":"3ed98bf8-2a88-482e-9de1-8fa7972508d7","Type":"ContainerStarted","Data":"2e3434b5632d3e8895046a235361fec1eca31e516852d1624922a2dd3ef23e60"} Jan 05 14:07:51 crc kubenswrapper[4740]: I0105 14:07:51.849406 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd"] Jan 05 14:07:51 crc kubenswrapper[4740]: I0105 14:07:51.853986 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd" Jan 05 14:07:51 crc kubenswrapper[4740]: I0105 14:07:51.869448 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd"] Jan 05 14:07:51 crc kubenswrapper[4740]: I0105 14:07:51.897532 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb571d2d-b11f-47e3-855a-6b684dd3937e-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-sjtsd\" (UID: \"cb571d2d-b11f-47e3-855a-6b684dd3937e\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd" Jan 05 14:07:51 crc kubenswrapper[4740]: I0105 14:07:51.897741 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwjl4\" (UniqueName: \"kubernetes.io/projected/cb571d2d-b11f-47e3-855a-6b684dd3937e-kube-api-access-fwjl4\") pod \"mysqld-exporter-openstack-cell1-db-create-sjtsd\" (UID: \"cb571d2d-b11f-47e3-855a-6b684dd3937e\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd" Jan 05 14:07:51 crc kubenswrapper[4740]: I0105 14:07:51.999391 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwjl4\" (UniqueName: \"kubernetes.io/projected/cb571d2d-b11f-47e3-855a-6b684dd3937e-kube-api-access-fwjl4\") pod \"mysqld-exporter-openstack-cell1-db-create-sjtsd\" (UID: \"cb571d2d-b11f-47e3-855a-6b684dd3937e\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd" Jan 05 14:07:51 crc kubenswrapper[4740]: I0105 14:07:51.999535 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb571d2d-b11f-47e3-855a-6b684dd3937e-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-sjtsd\" (UID: \"cb571d2d-b11f-47e3-855a-6b684dd3937e\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd" Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.000580 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb571d2d-b11f-47e3-855a-6b684dd3937e-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-sjtsd\" (UID: \"cb571d2d-b11f-47e3-855a-6b684dd3937e\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd" Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.019674 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwjl4\" (UniqueName: \"kubernetes.io/projected/cb571d2d-b11f-47e3-855a-6b684dd3937e-kube-api-access-fwjl4\") pod \"mysqld-exporter-openstack-cell1-db-create-sjtsd\" (UID: \"cb571d2d-b11f-47e3-855a-6b684dd3937e\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd" Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.059044 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-d171-account-create-update-4ckgq"] Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.060275 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-d171-account-create-update-4ckgq" Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.061585 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.068821 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-d171-account-create-update-4ckgq"] Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.100757 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4rl8\" (UniqueName: \"kubernetes.io/projected/bd13c104-1d69-4bc0-86f5-c9aad7c9010a-kube-api-access-b4rl8\") pod \"mysqld-exporter-d171-account-create-update-4ckgq\" (UID: \"bd13c104-1d69-4bc0-86f5-c9aad7c9010a\") " pod="openstack/mysqld-exporter-d171-account-create-update-4ckgq" Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.101502 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd13c104-1d69-4bc0-86f5-c9aad7c9010a-operator-scripts\") pod \"mysqld-exporter-d171-account-create-update-4ckgq\" (UID: \"bd13c104-1d69-4bc0-86f5-c9aad7c9010a\") " pod="openstack/mysqld-exporter-d171-account-create-update-4ckgq" Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.203514 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd13c104-1d69-4bc0-86f5-c9aad7c9010a-operator-scripts\") pod \"mysqld-exporter-d171-account-create-update-4ckgq\" (UID: \"bd13c104-1d69-4bc0-86f5-c9aad7c9010a\") " pod="openstack/mysqld-exporter-d171-account-create-update-4ckgq" Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.203609 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4rl8\" (UniqueName: \"kubernetes.io/projected/bd13c104-1d69-4bc0-86f5-c9aad7c9010a-kube-api-access-b4rl8\") pod \"mysqld-exporter-d171-account-create-update-4ckgq\" (UID: \"bd13c104-1d69-4bc0-86f5-c9aad7c9010a\") " pod="openstack/mysqld-exporter-d171-account-create-update-4ckgq" Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.204225 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd13c104-1d69-4bc0-86f5-c9aad7c9010a-operator-scripts\") pod \"mysqld-exporter-d171-account-create-update-4ckgq\" (UID: \"bd13c104-1d69-4bc0-86f5-c9aad7c9010a\") " pod="openstack/mysqld-exporter-d171-account-create-update-4ckgq" Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.219655 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4rl8\" (UniqueName: \"kubernetes.io/projected/bd13c104-1d69-4bc0-86f5-c9aad7c9010a-kube-api-access-b4rl8\") pod \"mysqld-exporter-d171-account-create-update-4ckgq\" (UID: \"bd13c104-1d69-4bc0-86f5-c9aad7c9010a\") " pod="openstack/mysqld-exporter-d171-account-create-update-4ckgq" Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.276565 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd" Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.486386 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-d171-account-create-update-4ckgq" Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.680630 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hr9pm-config-qzl27" event={"ID":"3ed98bf8-2a88-482e-9de1-8fa7972508d7","Type":"ContainerStarted","Data":"761f5acf52bce2076b89940bc686a118270fc0ca2adeaed1f8c5697af7c01aef"} Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.692267 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"4396968c-d77b-434d-888f-3ab578514bbe","Type":"ContainerStarted","Data":"9c471ebbc6929cb56e3c7e7bca0c77bb703fd4061fb6e8de0bccfc7f59fb3bcf"} Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.694536 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.703948 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-hr9pm-config-qzl27" podStartSLOduration=2.703933246 podStartE2EDuration="2.703933246s" podCreationTimestamp="2026-01-05 14:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:07:52.703050602 +0000 UTC m=+1122.009959181" watchObservedRunningTime="2026-01-05 14:07:52.703933246 +0000 UTC m=+1122.010841825" Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.739164 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=-9223371958.115637 podStartE2EDuration="1m18.739139699s" podCreationTimestamp="2026-01-05 14:06:34 +0000 UTC" firstStartedPulling="2026-01-05 14:06:37.500246188 +0000 UTC m=+1046.807154767" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:07:52.729296255 +0000 UTC m=+1122.036204844" watchObservedRunningTime="2026-01-05 14:07:52.739139699 +0000 UTC m=+1122.046048278" Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.757081 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd"] Jan 05 14:07:52 crc kubenswrapper[4740]: I0105 14:07:52.992894 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-d171-account-create-update-4ckgq"] Jan 05 14:07:53 crc kubenswrapper[4740]: I0105 14:07:53.073397 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9czq7"] Jan 05 14:07:53 crc kubenswrapper[4740]: I0105 14:07:53.083299 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9czq7"] Jan 05 14:07:53 crc kubenswrapper[4740]: I0105 14:07:53.704927 4740 generic.go:334] "Generic (PLEG): container finished" podID="bd13c104-1d69-4bc0-86f5-c9aad7c9010a" containerID="cab7fe74becde430c00735e1d5b4e7fc02684c5356128c158c54dfc610667e3c" exitCode=0 Jan 05 14:07:53 crc kubenswrapper[4740]: I0105 14:07:53.705055 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-d171-account-create-update-4ckgq" event={"ID":"bd13c104-1d69-4bc0-86f5-c9aad7c9010a","Type":"ContainerDied","Data":"cab7fe74becde430c00735e1d5b4e7fc02684c5356128c158c54dfc610667e3c"} Jan 05 14:07:53 crc kubenswrapper[4740]: I0105 14:07:53.705372 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-d171-account-create-update-4ckgq" 
event={"ID":"bd13c104-1d69-4bc0-86f5-c9aad7c9010a","Type":"ContainerStarted","Data":"fdd5be3291ee90aa43104dec06abc70ae5b7f784914591fb6fee183e16874c62"} Jan 05 14:07:53 crc kubenswrapper[4740]: I0105 14:07:53.708900 4740 generic.go:334] "Generic (PLEG): container finished" podID="cb571d2d-b11f-47e3-855a-6b684dd3937e" containerID="9fd2217025cd011d3f47e68738af8a02300b10d3ffae26526c476014fb590ce2" exitCode=0 Jan 05 14:07:53 crc kubenswrapper[4740]: I0105 14:07:53.708978 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd" event={"ID":"cb571d2d-b11f-47e3-855a-6b684dd3937e","Type":"ContainerDied","Data":"9fd2217025cd011d3f47e68738af8a02300b10d3ffae26526c476014fb590ce2"} Jan 05 14:07:53 crc kubenswrapper[4740]: I0105 14:07:53.709012 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd" event={"ID":"cb571d2d-b11f-47e3-855a-6b684dd3937e","Type":"ContainerStarted","Data":"ac3d88c08805912db3057f8ecba2c2afcca29b53df3eb96106217fdad071a933"} Jan 05 14:07:53 crc kubenswrapper[4740]: I0105 14:07:53.711745 4740 generic.go:334] "Generic (PLEG): container finished" podID="3ed98bf8-2a88-482e-9de1-8fa7972508d7" containerID="761f5acf52bce2076b89940bc686a118270fc0ca2adeaed1f8c5697af7c01aef" exitCode=0 Jan 05 14:07:53 crc kubenswrapper[4740]: I0105 14:07:53.711848 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hr9pm-config-qzl27" event={"ID":"3ed98bf8-2a88-482e-9de1-8fa7972508d7","Type":"ContainerDied","Data":"761f5acf52bce2076b89940bc686a118270fc0ca2adeaed1f8c5697af7c01aef"} Jan 05 14:07:53 crc kubenswrapper[4740]: I0105 14:07:53.714233 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a87008bf-295c-4343-a6b2-f3fd37fa581d","Type":"ContainerStarted","Data":"b66753c99bb53672247b03636d926f2264a04fa71c4e5c8b582301c45ed37216"} Jan 05 14:07:54 crc kubenswrapper[4740]: I0105 14:07:54.988462 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="826861b3-ea51-459e-8cf1-6a836e8433b0" path="/var/lib/kubelet/pods/826861b3-ea51-459e-8cf1-6a836e8433b0/volumes" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.068170 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.081925 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e1b97919-20e0-4eb9-a60b-0f52a4d7c73b-etc-swift\") pod \"swift-storage-0\" (UID: \"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b\") " pod="openstack/swift-storage-0" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.227585 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.277050 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.379792 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb571d2d-b11f-47e3-855a-6b684dd3937e-operator-scripts\") pod \"cb571d2d-b11f-47e3-855a-6b684dd3937e\" (UID: \"cb571d2d-b11f-47e3-855a-6b684dd3937e\") " Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.379840 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwjl4\" (UniqueName: \"kubernetes.io/projected/cb571d2d-b11f-47e3-855a-6b684dd3937e-kube-api-access-fwjl4\") pod \"cb571d2d-b11f-47e3-855a-6b684dd3937e\" (UID: \"cb571d2d-b11f-47e3-855a-6b684dd3937e\") " Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.381620 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb571d2d-b11f-47e3-855a-6b684dd3937e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cb571d2d-b11f-47e3-855a-6b684dd3937e" (UID: "cb571d2d-b11f-47e3-855a-6b684dd3937e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.394218 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb571d2d-b11f-47e3-855a-6b684dd3937e-kube-api-access-fwjl4" (OuterVolumeSpecName: "kube-api-access-fwjl4") pod "cb571d2d-b11f-47e3-855a-6b684dd3937e" (UID: "cb571d2d-b11f-47e3-855a-6b684dd3937e"). InnerVolumeSpecName "kube-api-access-fwjl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.405255 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.406678 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-d171-account-create-update-4ckgq" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.482290 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb571d2d-b11f-47e3-855a-6b684dd3937e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.482636 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwjl4\" (UniqueName: \"kubernetes.io/projected/cb571d2d-b11f-47e3-855a-6b684dd3937e-kube-api-access-fwjl4\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.583441 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-run-ovn\") pod \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.583531 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-run\") pod \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.583590 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9v7t\" (UniqueName: \"kubernetes.io/projected/3ed98bf8-2a88-482e-9de1-8fa7972508d7-kube-api-access-m9v7t\") pod \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.583643 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3ed98bf8-2a88-482e-9de1-8fa7972508d7" (UID: "3ed98bf8-2a88-482e-9de1-8fa7972508d7"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.583735 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-log-ovn\") pod \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.583734 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-run" (OuterVolumeSpecName: "var-run") pod "3ed98bf8-2a88-482e-9de1-8fa7972508d7" (UID: "3ed98bf8-2a88-482e-9de1-8fa7972508d7"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.583757 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4rl8\" (UniqueName: \"kubernetes.io/projected/bd13c104-1d69-4bc0-86f5-c9aad7c9010a-kube-api-access-b4rl8\") pod \"bd13c104-1d69-4bc0-86f5-c9aad7c9010a\" (UID: \"bd13c104-1d69-4bc0-86f5-c9aad7c9010a\") " Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.583821 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd13c104-1d69-4bc0-86f5-c9aad7c9010a-operator-scripts\") pod \"bd13c104-1d69-4bc0-86f5-c9aad7c9010a\" (UID: \"bd13c104-1d69-4bc0-86f5-c9aad7c9010a\") " Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.583868 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3ed98bf8-2a88-482e-9de1-8fa7972508d7-additional-scripts\") pod \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.583896 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ed98bf8-2a88-482e-9de1-8fa7972508d7-scripts\") pod \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\" (UID: \"3ed98bf8-2a88-482e-9de1-8fa7972508d7\") " Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.584218 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3ed98bf8-2a88-482e-9de1-8fa7972508d7" (UID: "3ed98bf8-2a88-482e-9de1-8fa7972508d7"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.584547 4740 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.584562 4740 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-run\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.584574 4740 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ed98bf8-2a88-482e-9de1-8fa7972508d7-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.584749 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ed98bf8-2a88-482e-9de1-8fa7972508d7-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3ed98bf8-2a88-482e-9de1-8fa7972508d7" (UID: "3ed98bf8-2a88-482e-9de1-8fa7972508d7"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.584933 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd13c104-1d69-4bc0-86f5-c9aad7c9010a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd13c104-1d69-4bc0-86f5-c9aad7c9010a" (UID: "bd13c104-1d69-4bc0-86f5-c9aad7c9010a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.585017 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ed98bf8-2a88-482e-9de1-8fa7972508d7-scripts" (OuterVolumeSpecName: "scripts") pod "3ed98bf8-2a88-482e-9de1-8fa7972508d7" (UID: "3ed98bf8-2a88-482e-9de1-8fa7972508d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.589637 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed98bf8-2a88-482e-9de1-8fa7972508d7-kube-api-access-m9v7t" (OuterVolumeSpecName: "kube-api-access-m9v7t") pod "3ed98bf8-2a88-482e-9de1-8fa7972508d7" (UID: "3ed98bf8-2a88-482e-9de1-8fa7972508d7"). InnerVolumeSpecName "kube-api-access-m9v7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.590422 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd13c104-1d69-4bc0-86f5-c9aad7c9010a-kube-api-access-b4rl8" (OuterVolumeSpecName: "kube-api-access-b4rl8") pod "bd13c104-1d69-4bc0-86f5-c9aad7c9010a" (UID: "bd13c104-1d69-4bc0-86f5-c9aad7c9010a"). InnerVolumeSpecName "kube-api-access-b4rl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.686238 4740 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3ed98bf8-2a88-482e-9de1-8fa7972508d7-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.686273 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ed98bf8-2a88-482e-9de1-8fa7972508d7-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.686283 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9v7t\" (UniqueName: \"kubernetes.io/projected/3ed98bf8-2a88-482e-9de1-8fa7972508d7-kube-api-access-m9v7t\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.686294 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4rl8\" (UniqueName: \"kubernetes.io/projected/bd13c104-1d69-4bc0-86f5-c9aad7c9010a-kube-api-access-b4rl8\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.686306 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd13c104-1d69-4bc0-86f5-c9aad7c9010a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.764509 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd" event={"ID":"cb571d2d-b11f-47e3-855a-6b684dd3937e","Type":"ContainerDied","Data":"ac3d88c08805912db3057f8ecba2c2afcca29b53df3eb96106217fdad071a933"} Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.764566 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac3d88c08805912db3057f8ecba2c2afcca29b53df3eb96106217fdad071a933" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.764636 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.776935 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hr9pm-config-qzl27" event={"ID":"3ed98bf8-2a88-482e-9de1-8fa7972508d7","Type":"ContainerDied","Data":"2e3434b5632d3e8895046a235361fec1eca31e516852d1624922a2dd3ef23e60"} Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.776985 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e3434b5632d3e8895046a235361fec1eca31e516852d1624922a2dd3ef23e60" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.777026 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hr9pm-config-qzl27" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.779930 4740 generic.go:334] "Generic (PLEG): container finished" podID="eeb4c870-b0d8-4d92-82c1-aedb35200c4b" containerID="b3dbdf8362bc625ca04348e9bb74d68d3f768dd9528dbaeca36d3f3a321e26f6" exitCode=0 Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.779991 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"eeb4c870-b0d8-4d92-82c1-aedb35200c4b","Type":"ContainerDied","Data":"b3dbdf8362bc625ca04348e9bb74d68d3f768dd9528dbaeca36d3f3a321e26f6"} Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.802495 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-d171-account-create-update-4ckgq" event={"ID":"bd13c104-1d69-4bc0-86f5-c9aad7c9010a","Type":"ContainerDied","Data":"fdd5be3291ee90aa43104dec06abc70ae5b7f784914591fb6fee183e16874c62"} Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.802544 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdd5be3291ee90aa43104dec06abc70ae5b7f784914591fb6fee183e16874c62" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.802625 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-d171-account-create-update-4ckgq" Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.821046 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hr9pm-config-qzl27"] Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.841332 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hr9pm-config-qzl27"] Jan 05 14:07:55 crc kubenswrapper[4740]: I0105 14:07:55.917642 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 05 14:07:56 crc kubenswrapper[4740]: I0105 14:07:56.819883 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b","Type":"ContainerStarted","Data":"bfb72b0be2f1e427b5344200f20bf070c38808f72f5e307dcb6f15e83761a903"} Jan 05 14:07:56 crc kubenswrapper[4740]: I0105 14:07:56.822797 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"eeb4c870-b0d8-4d92-82c1-aedb35200c4b","Type":"ContainerStarted","Data":"dd161248466ee88f4ab07ec76467467275c500b0f8e46abc31cd22b51e46ddcf"} Jan 05 14:07:56 crc kubenswrapper[4740]: I0105 14:07:56.823123 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Jan 05 14:07:56 crc kubenswrapper[4740]: I0105 14:07:56.856269 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=-9223371953.998526 podStartE2EDuration="1m22.856250381s" podCreationTimestamp="2026-01-05 14:06:34 +0000 UTC" firstStartedPulling="2026-01-05 14:06:37.338911045 +0000 UTC m=+1046.645819614" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:07:56.848170464 +0000 UTC m=+1126.155079043" watchObservedRunningTime="2026-01-05 14:07:56.856250381 +0000 UTC m=+1126.163158960" Jan 05 14:07:56 crc kubenswrapper[4740]: I0105 14:07:56.981737 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed98bf8-2a88-482e-9de1-8fa7972508d7" path="/var/lib/kubelet/pods/3ed98bf8-2a88-482e-9de1-8fa7972508d7/volumes" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.210672 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Jan 05 14:07:57 crc kubenswrapper[4740]: E0105 14:07:57.211149 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed98bf8-2a88-482e-9de1-8fa7972508d7" containerName="ovn-config" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.211170 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed98bf8-2a88-482e-9de1-8fa7972508d7" containerName="ovn-config" Jan 05 14:07:57 crc kubenswrapper[4740]: E0105 14:07:57.211185 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb571d2d-b11f-47e3-855a-6b684dd3937e" containerName="mariadb-database-create" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.211191 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb571d2d-b11f-47e3-855a-6b684dd3937e" containerName="mariadb-database-create" Jan 05 14:07:57 crc kubenswrapper[4740]: E0105 14:07:57.211212 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd13c104-1d69-4bc0-86f5-c9aad7c9010a" containerName="mariadb-account-create-update" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.211218 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd13c104-1d69-4bc0-86f5-c9aad7c9010a" 
containerName="mariadb-account-create-update" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.211422 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb571d2d-b11f-47e3-855a-6b684dd3937e" containerName="mariadb-database-create" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.211444 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd13c104-1d69-4bc0-86f5-c9aad7c9010a" containerName="mariadb-account-create-update" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.211453 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed98bf8-2a88-482e-9de1-8fa7972508d7" containerName="ovn-config" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.212192 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.214812 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.232378 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.331143 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7\") " pod="openstack/mysqld-exporter-0" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.331248 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-config-data\") pod \"mysqld-exporter-0\" (UID: \"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7\") " pod="openstack/mysqld-exporter-0" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.331324 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz2xg\" (UniqueName: \"kubernetes.io/projected/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-kube-api-access-tz2xg\") pod \"mysqld-exporter-0\" (UID: \"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7\") " pod="openstack/mysqld-exporter-0" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.433176 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz2xg\" (UniqueName: \"kubernetes.io/projected/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-kube-api-access-tz2xg\") pod \"mysqld-exporter-0\" (UID: \"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7\") " pod="openstack/mysqld-exporter-0" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.433259 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7\") " pod="openstack/mysqld-exporter-0" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.433338 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-config-data\") pod \"mysqld-exporter-0\" (UID: \"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7\") " pod="openstack/mysqld-exporter-0" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.465452 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7\") " pod="openstack/mysqld-exporter-0" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.470730 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-config-data\") pod \"mysqld-exporter-0\" (UID: \"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7\") " pod="openstack/mysqld-exporter-0" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.474442 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz2xg\" (UniqueName: \"kubernetes.io/projected/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-kube-api-access-tz2xg\") pod \"mysqld-exporter-0\" (UID: \"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7\") " pod="openstack/mysqld-exporter-0" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.533633 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.899522 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b","Type":"ContainerStarted","Data":"d7c3b95aa9a0e364e6eea45a45f6221c5ec2a0aa31ab898ecf4455a0517db6ec"} Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.903220 4740 generic.go:334] "Generic (PLEG): container finished" podID="3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" containerID="ec151a9ec49a8c4a5f947daa6a8a86ed246e271c624f6ccbece01f1324acdbef" exitCode=0 Jan 05 14:07:57 crc kubenswrapper[4740]: I0105 14:07:57.904193 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299","Type":"ContainerDied","Data":"ec151a9ec49a8c4a5f947daa6a8a86ed246e271c624f6ccbece01f1324acdbef"} Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.090134 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-gpc72"] Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.173598 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gpc72" Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.178701 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.212365 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gpc72"] Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.221959 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.256452 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25-operator-scripts\") pod \"root-account-create-update-gpc72\" (UID: \"adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25\") " pod="openstack/root-account-create-update-gpc72" Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.256585 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sglsq\" (UniqueName: \"kubernetes.io/projected/adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25-kube-api-access-sglsq\") pod \"root-account-create-update-gpc72\" (UID: \"adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25\") " pod="openstack/root-account-create-update-gpc72" Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.358401 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25-operator-scripts\") pod \"root-account-create-update-gpc72\" (UID: \"adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25\") " pod="openstack/root-account-create-update-gpc72" Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.358524 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sglsq\" (UniqueName: \"kubernetes.io/projected/adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25-kube-api-access-sglsq\") pod \"root-account-create-update-gpc72\" (UID: \"adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25\") " pod="openstack/root-account-create-update-gpc72" Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.359153 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25-operator-scripts\") pod \"root-account-create-update-gpc72\" (UID: \"adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25\") " pod="openstack/root-account-create-update-gpc72" Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.374995 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sglsq\" (UniqueName: \"kubernetes.io/projected/adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25-kube-api-access-sglsq\") pod \"root-account-create-update-gpc72\" (UID: \"adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25\") " pod="openstack/root-account-create-update-gpc72" Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.499518 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gpc72" Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.913101 4740 generic.go:334] "Generic (PLEG): container finished" podID="a87008bf-295c-4343-a6b2-f3fd37fa581d" containerID="b66753c99bb53672247b03636d926f2264a04fa71c4e5c8b582301c45ed37216" exitCode=0 Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.913542 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a87008bf-295c-4343-a6b2-f3fd37fa581d","Type":"ContainerDied","Data":"b66753c99bb53672247b03636d926f2264a04fa71c4e5c8b582301c45ed37216"} Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.916447 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7","Type":"ContainerStarted","Data":"426ab1585700c30f90e2d09ac12f667bc3fd58186ccf6fbd196bd7462616cc1e"} Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.919509 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299","Type":"ContainerStarted","Data":"de7e0d44b9d033eede30158cf5b750ac579ec6dc07c237d4c01d036ff58d10fa"} Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.919956 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.923519 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b","Type":"ContainerStarted","Data":"3577f4c7905641cbc3632dd71cb22eda9e4d3d43111eba6f175db0ccb5b7f43e"} Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.923561 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b","Type":"ContainerStarted","Data":"49e0e665b1c9b2c55d239517db4f96fd0d40e28f12678a6cf2ca583f9babd5c9"} Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.923575 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b","Type":"ContainerStarted","Data":"1853597bb836c24ad86908ae5c2748aed01c375bdf479db8f0ad166192f64b66"} Jan 05 14:07:58 crc kubenswrapper[4740]: I0105 14:07:58.999865 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371951.854937 podStartE2EDuration="1m24.999839059s" podCreationTimestamp="2026-01-05 14:06:34 +0000 UTC" firstStartedPulling="2026-01-05 14:06:37.333155141 +0000 UTC m=+1046.640063720" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:07:58.976905405 +0000 UTC m=+1128.283813984" watchObservedRunningTime="2026-01-05 14:07:58.999839059 +0000 UTC m=+1128.306747678" Jan 05 14:07:59 crc kubenswrapper[4740]: I0105 14:07:59.031251 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gpc72"] Jan 05 14:07:59 crc kubenswrapper[4740]: W0105 14:07:59.032726 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadbfdbc4_e4a5_46ed_9f3e_ccf21e081e25.slice/crio-e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e WatchSource:0}: Error finding container e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e: Status 404 returned error can't find the container 
with id e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e Jan 05 14:07:59 crc kubenswrapper[4740]: I0105 14:07:59.938489 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gpc72" event={"ID":"adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25","Type":"ContainerStarted","Data":"e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e"} Jan 05 14:07:59 crc kubenswrapper[4740]: I0105 14:07:59.943170 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a87008bf-295c-4343-a6b2-f3fd37fa581d","Type":"ContainerStarted","Data":"1dfd6a5c76d784d2e8a8561a366e84bf08600abbfa2bfbbda2c79abab6b2b943"} Jan 05 14:08:06 crc kubenswrapper[4740]: I0105 14:08:06.680262 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="eeb4c870-b0d8-4d92-82c1-aedb35200c4b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: connect: connection refused" Jan 05 14:08:06 crc kubenswrapper[4740]: I0105 14:08:06.691749 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="4396968c-d77b-434d-888f-3ab578514bbe" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.137:5671: connect: connection refused" Jan 05 14:08:06 crc kubenswrapper[4740]: I0105 14:08:06.771356 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:08:09 crc kubenswrapper[4740]: I0105 14:08:09.047131 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7","Type":"ContainerStarted","Data":"7619f4bb3740d69d69acd1bc1b4226a393608569026b5a98ebb69115ee39acc6"} Jan 05 14:08:09 crc kubenswrapper[4740]: I0105 14:08:09.050468 4740 generic.go:334] "Generic (PLEG): container finished" podID="adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25" containerID="982f84fd58e7b388c60b6cdcac99b85ba29bd739e28edaf988a6e07ba0d0b54c" exitCode=0 Jan 05 14:08:09 crc kubenswrapper[4740]: I0105 14:08:09.050549 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gpc72" event={"ID":"adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25","Type":"ContainerDied","Data":"982f84fd58e7b388c60b6cdcac99b85ba29bd739e28edaf988a6e07ba0d0b54c"} Jan 05 14:08:09 crc kubenswrapper[4740]: I0105 14:08:09.055373 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b","Type":"ContainerStarted","Data":"eee144de2bda2c439149cd76bf322bddca441aeadf936ab3b08c5c2925e7746e"} Jan 05 14:08:09 crc kubenswrapper[4740]: I0105 14:08:09.055425 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b","Type":"ContainerStarted","Data":"938816b92909822354db4f08dc72f77dc1242e7b9dd743ad26c853da8f436beb"} Jan 05 14:08:09 crc kubenswrapper[4740]: I0105 14:08:09.055440 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b","Type":"ContainerStarted","Data":"98092266bc839e37acb2aaf5bf1d307d708961f04ad07e77163448165e43e010"} Jan 05 14:08:09 crc kubenswrapper[4740]: I0105 14:08:09.055451 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b","Type":"ContainerStarted","Data":"a92e64102635f68a84bbd827e258eb899ca02068416fb5d82d841543386b23f0"} Jan 05 14:08:09 crc kubenswrapper[4740]: I0105 14:08:09.059191 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qmxnw" event={"ID":"de1ed2b0-7aa0-43ad-a948-07e7b8de711f","Type":"ContainerStarted","Data":"e2b4f37276b9fcdb3c16877c9626870f09549b6be3d7cb470634286a077a7336"} Jan 05 14:08:09 crc kubenswrapper[4740]: I0105 14:08:09.075937 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.346763114 podStartE2EDuration="12.075918048s" podCreationTimestamp="2026-01-05 14:07:57 +0000 UTC" firstStartedPulling="2026-01-05 14:07:58.199735323 +0000 UTC m=+1127.506643902" lastFinishedPulling="2026-01-05 14:08:07.928890257 +0000 UTC m=+1137.235798836" observedRunningTime="2026-01-05 14:08:09.062740824 +0000 UTC m=+1138.369649403" watchObservedRunningTime="2026-01-05 14:08:09.075918048 +0000 UTC m=+1138.382826627" Jan 05 14:08:09 crc kubenswrapper[4740]: I0105 14:08:09.083794 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-qmxnw" podStartSLOduration=2.597505725 podStartE2EDuration="19.083774778s" podCreationTimestamp="2026-01-05 14:07:50 +0000 UTC" firstStartedPulling="2026-01-05 14:07:51.441566496 +0000 UTC m=+1120.748475075" lastFinishedPulling="2026-01-05 14:08:07.927835529 +0000 UTC m=+1137.234744128" observedRunningTime="2026-01-05 14:08:09.078113407 +0000 UTC m=+1138.385022006" watchObservedRunningTime="2026-01-05 14:08:09.083774778 +0000 UTC m=+1138.390683367" Jan 05 14:08:10 crc kubenswrapper[4740]: I0105 14:08:10.552360 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gpc72" Jan 05 14:08:10 crc kubenswrapper[4740]: I0105 14:08:10.655637 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sglsq\" (UniqueName: \"kubernetes.io/projected/adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25-kube-api-access-sglsq\") pod \"adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25\" (UID: \"adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25\") " Jan 05 14:08:10 crc kubenswrapper[4740]: I0105 14:08:10.655843 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25-operator-scripts\") pod \"adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25\" (UID: \"adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25\") " Jan 05 14:08:10 crc kubenswrapper[4740]: I0105 14:08:10.656595 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25" (UID: "adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:10 crc kubenswrapper[4740]: I0105 14:08:10.665396 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25-kube-api-access-sglsq" (OuterVolumeSpecName: "kube-api-access-sglsq") pod "adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25" (UID: "adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25"). InnerVolumeSpecName "kube-api-access-sglsq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:08:10 crc kubenswrapper[4740]: I0105 14:08:10.758488 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:10 crc kubenswrapper[4740]: I0105 14:08:10.758875 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sglsq\" (UniqueName: \"kubernetes.io/projected/adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25-kube-api-access-sglsq\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:11 crc kubenswrapper[4740]: I0105 14:08:11.091033 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gpc72" event={"ID":"adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25","Type":"ContainerDied","Data":"e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e"} Jan 05 14:08:11 crc kubenswrapper[4740]: I0105 14:08:11.091754 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e" Jan 05 14:08:11 crc kubenswrapper[4740]: I0105 14:08:11.091725 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gpc72" Jan 05 14:08:11 crc kubenswrapper[4740]: I0105 14:08:11.098559 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b","Type":"ContainerStarted","Data":"d5e7d4fc37f97e1e6f85530cbf311c06bdf8b9ba3168080de7759c5ca13c65ca"} Jan 05 14:08:11 crc kubenswrapper[4740]: I0105 14:08:11.098667 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b","Type":"ContainerStarted","Data":"12dc1e6827ff16d34488001d48387d06779b4facfc3793dd1b43ce3260ea6183"} Jan 05 14:08:12 crc kubenswrapper[4740]: I0105 14:08:12.122455 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b","Type":"ContainerStarted","Data":"7f83811c5092ccb27a86fee5f76bd7a465bfa4a5568b1215c03b1c231e9b12a3"} Jan 05 14:08:12 crc kubenswrapper[4740]: I0105 14:08:12.122735 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b","Type":"ContainerStarted","Data":"0f1dd6ff010f1556daec8191eecd121feaf561d14121cdc1ac49470a7db166b9"} Jan 05 14:08:13 crc kubenswrapper[4740]: I0105 14:08:13.139851 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a87008bf-295c-4343-a6b2-f3fd37fa581d","Type":"ContainerStarted","Data":"8ec480681b3c2c1b5303b010f5890dd70e6a8b5047c2d2ba4ed934d73e7dc526"} Jan 05 14:08:13 crc kubenswrapper[4740]: I0105 14:08:13.140343 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a87008bf-295c-4343-a6b2-f3fd37fa581d","Type":"ContainerStarted","Data":"1b98b141ccc46962c5df15cb08b1a18e04c4c8a67b384b2216ea78eae4293fa4"} Jan 05 14:08:13 crc kubenswrapper[4740]: I0105 14:08:13.156889 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b","Type":"ContainerStarted","Data":"b5611d79b1bc2e8bd8ef475d315ec820ec22be824870ae50521563aaaf8df785"} Jan 05 14:08:13 crc kubenswrapper[4740]: I0105 14:08:13.177653 4740 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=25.177631346 podStartE2EDuration="25.177631346s" podCreationTimestamp="2026-01-05 14:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:08:13.172705204 +0000 UTC m=+1142.479613833" watchObservedRunningTime="2026-01-05 14:08:13.177631346 +0000 UTC m=+1142.484539955" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.052570 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.191240 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b","Type":"ContainerStarted","Data":"7ab7bb1dca8ce7b97c9e8c13d7c013d527db46df2a531325ce54e9d125329bd3"} Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.191340 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e1b97919-20e0-4eb9-a60b-0f52a4d7c73b","Type":"ContainerStarted","Data":"2514bb5eca421f15a965d304fb7bad618d7302b35c12ea00cba32e9b34610f69"} Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.278510 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.978762606 podStartE2EDuration="52.27848585s" podCreationTimestamp="2026-01-05 14:07:22 +0000 UTC" firstStartedPulling="2026-01-05 14:07:55.932602525 +0000 UTC m=+1125.239511104" lastFinishedPulling="2026-01-05 14:08:10.232325759 +0000 UTC m=+1139.539234348" observedRunningTime="2026-01-05 14:08:14.25459831 +0000 UTC m=+1143.561506969" watchObservedRunningTime="2026-01-05 14:08:14.27848585 +0000 UTC m=+1143.585394469" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.596664 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-xz2j4"] Jan 05 14:08:14 crc kubenswrapper[4740]: E0105 14:08:14.597276 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25" containerName="mariadb-account-create-update" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.597298 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25" containerName="mariadb-account-create-update" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.597589 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25" containerName="mariadb-account-create-update" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.598929 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.602894 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.612952 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-xz2j4"] Jan 05 14:08:14 crc kubenswrapper[4740]: E0105 14:08:14.696118 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadbfdbc4_e4a5_46ed_9f3e_ccf21e081e25.slice/crio-e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e\": RecentStats: unable to find data in memory cache]" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.800303 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.800364 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.800618 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-config\") pod \"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.800766 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.800844 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.800970 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66rwk\" (UniqueName: \"kubernetes.io/projected/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-kube-api-access-66rwk\") pod \"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.903559 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.903902 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.904021 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-config\") pod \"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.904058 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.904112 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.904156 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66rwk\" (UniqueName: \"kubernetes.io/projected/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-kube-api-access-66rwk\") pod \"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.904765 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.905297 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.905355 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.905726 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-config\") pod \"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.906551 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:14 crc kubenswrapper[4740]: I0105 14:08:14.939123 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66rwk\" (UniqueName: \"kubernetes.io/projected/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-kube-api-access-66rwk\") pod \"dnsmasq-dns-5c79d794d7-xz2j4\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:15 crc kubenswrapper[4740]: I0105 14:08:15.216458 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:15 crc kubenswrapper[4740]: I0105 14:08:15.710615 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-xz2j4"] Jan 05 14:08:15 crc kubenswrapper[4740]: W0105 14:08:15.720424 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3bfc133_670f_4697_b3f3_f7ae2fdd52ae.slice/crio-24f9b95ea1000e68becc7e11443d9c4d0afe1f16c0a08096f963a4cea4cc0b98 WatchSource:0}: Error finding container 24f9b95ea1000e68becc7e11443d9c4d0afe1f16c0a08096f963a4cea4cc0b98: Status 404 returned error can't find the container with id 24f9b95ea1000e68becc7e11443d9c4d0afe1f16c0a08096f963a4cea4cc0b98 Jan 05 14:08:16 crc kubenswrapper[4740]: I0105 14:08:16.230010 4740 generic.go:334] "Generic (PLEG): container finished" podID="b3bfc133-670f-4697-b3f3-f7ae2fdd52ae" containerID="f91309fe563029c5190fe05d9b6a845f83d284c58dcd42796b8018e1004d6a44" exitCode=0 Jan 05 14:08:16 crc kubenswrapper[4740]: I0105 14:08:16.230228 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" event={"ID":"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae","Type":"ContainerDied","Data":"f91309fe563029c5190fe05d9b6a845f83d284c58dcd42796b8018e1004d6a44"} Jan 05 14:08:16 crc kubenswrapper[4740]: I0105 14:08:16.230546 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" event={"ID":"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae","Type":"ContainerStarted","Data":"24f9b95ea1000e68becc7e11443d9c4d0afe1f16c0a08096f963a4cea4cc0b98"} Jan 05 14:08:16 crc kubenswrapper[4740]: E0105 14:08:16.409284 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadbfdbc4_e4a5_46ed_9f3e_ccf21e081e25.slice/crio-e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e\": RecentStats: unable to find data in memory cache]" Jan 05 14:08:16 crc kubenswrapper[4740]: I0105 14:08:16.652236 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 05 14:08:16 crc kubenswrapper[4740]: I0105 14:08:16.678347 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Jan 05 14:08:16 crc kubenswrapper[4740]: I0105 14:08:16.691221 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Jan 05 14:08:17 crc 
kubenswrapper[4740]: I0105 14:08:17.249833 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" event={"ID":"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae","Type":"ContainerStarted","Data":"ccea1ab020056224f676a29334c7607f18bae13eb7a08cab74f8341db1285de2"} Jan 05 14:08:17 crc kubenswrapper[4740]: I0105 14:08:17.252980 4740 generic.go:334] "Generic (PLEG): container finished" podID="de1ed2b0-7aa0-43ad-a948-07e7b8de711f" containerID="e2b4f37276b9fcdb3c16877c9626870f09549b6be3d7cb470634286a077a7336" exitCode=0 Jan 05 14:08:17 crc kubenswrapper[4740]: I0105 14:08:17.253022 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qmxnw" event={"ID":"de1ed2b0-7aa0-43ad-a948-07e7b8de711f","Type":"ContainerDied","Data":"e2b4f37276b9fcdb3c16877c9626870f09549b6be3d7cb470634286a077a7336"} Jan 05 14:08:17 crc kubenswrapper[4740]: I0105 14:08:17.294754 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" podStartSLOduration=3.2947258870000002 podStartE2EDuration="3.294725887s" podCreationTimestamp="2026-01-05 14:08:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:08:17.274437293 +0000 UTC m=+1146.581345892" watchObservedRunningTime="2026-01-05 14:08:17.294725887 +0000 UTC m=+1146.601634466" Jan 05 14:08:18 crc kubenswrapper[4740]: I0105 14:08:18.264265 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:18 crc kubenswrapper[4740]: I0105 14:08:18.759699 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-mx79w"] Jan 05 14:08:18 crc kubenswrapper[4740]: I0105 14:08:18.761402 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mx79w" Jan 05 14:08:18 crc kubenswrapper[4740]: I0105 14:08:18.775590 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-0ef3-account-create-update-lqc7s"] Jan 05 14:08:18 crc kubenswrapper[4740]: I0105 14:08:18.777092 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0ef3-account-create-update-lqc7s" Jan 05 14:08:18 crc kubenswrapper[4740]: I0105 14:08:18.786193 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 05 14:08:18 crc kubenswrapper[4740]: I0105 14:08:18.796453 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mx79w"] Jan 05 14:08:18 crc kubenswrapper[4740]: I0105 14:08:18.812610 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0ef3-account-create-update-lqc7s"] Jan 05 14:08:18 crc kubenswrapper[4740]: I0105 14:08:18.855110 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-pn5jk"] Jan 05 14:08:18 crc kubenswrapper[4740]: I0105 14:08:18.857713 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pn5jk" Jan 05 14:08:18 crc kubenswrapper[4740]: I0105 14:08:18.868825 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-pn5jk"] Jan 05 14:08:18 crc kubenswrapper[4740]: I0105 14:08:18.889726 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-qmxnw" Jan 05 14:08:18 crc kubenswrapper[4740]: I0105 14:08:18.898269 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvg8k\" (UniqueName: \"kubernetes.io/projected/2def1143-79d4-48ec-bae4-f0b620c8bc73-kube-api-access-qvg8k\") pod \"cinder-db-create-mx79w\" (UID: \"2def1143-79d4-48ec-bae4-f0b620c8bc73\") " pod="openstack/cinder-db-create-mx79w" Jan 05 14:08:18 crc kubenswrapper[4740]: I0105 14:08:18.898329 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpg2z\" (UniqueName: \"kubernetes.io/projected/edb21d30-fa05-490b-b1b1-cb0f73b8e149-kube-api-access-bpg2z\") pod \"cinder-0ef3-account-create-update-lqc7s\" (UID: \"edb21d30-fa05-490b-b1b1-cb0f73b8e149\") " pod="openstack/cinder-0ef3-account-create-update-lqc7s" Jan 05 14:08:18 crc kubenswrapper[4740]: I0105 14:08:18.898391 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2def1143-79d4-48ec-bae4-f0b620c8bc73-operator-scripts\") pod \"cinder-db-create-mx79w\" (UID: \"2def1143-79d4-48ec-bae4-f0b620c8bc73\") " pod="openstack/cinder-db-create-mx79w" Jan 05 14:08:18 crc kubenswrapper[4740]: I0105 14:08:18.898413 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edb21d30-fa05-490b-b1b1-cb0f73b8e149-operator-scripts\") pod \"cinder-0ef3-account-create-update-lqc7s\" (UID: \"edb21d30-fa05-490b-b1b1-cb0f73b8e149\") " pod="openstack/cinder-0ef3-account-create-update-lqc7s" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:18.999730 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-config-data\") pod \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\" (UID: \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\") " Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.000086 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-db-sync-config-data\") pod \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\" (UID: \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\") " Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.000183 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4bgj\" (UniqueName: \"kubernetes.io/projected/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-kube-api-access-t4bgj\") pod \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\" (UID: \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\") " Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.000232 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-combined-ca-bundle\") pod \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\" (UID: \"de1ed2b0-7aa0-43ad-a948-07e7b8de711f\") " Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.000626 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvg8k\" (UniqueName: \"kubernetes.io/projected/2def1143-79d4-48ec-bae4-f0b620c8bc73-kube-api-access-qvg8k\") pod \"cinder-db-create-mx79w\" (UID: \"2def1143-79d4-48ec-bae4-f0b620c8bc73\") " 
pod="openstack/cinder-db-create-mx79w" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.000674 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37129b31-a43d-412f-878e-72729d852124-operator-scripts\") pod \"barbican-db-create-pn5jk\" (UID: \"37129b31-a43d-412f-878e-72729d852124\") " pod="openstack/barbican-db-create-pn5jk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.000701 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpg2z\" (UniqueName: \"kubernetes.io/projected/edb21d30-fa05-490b-b1b1-cb0f73b8e149-kube-api-access-bpg2z\") pod \"cinder-0ef3-account-create-update-lqc7s\" (UID: \"edb21d30-fa05-490b-b1b1-cb0f73b8e149\") " pod="openstack/cinder-0ef3-account-create-update-lqc7s" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.000756 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2def1143-79d4-48ec-bae4-f0b620c8bc73-operator-scripts\") pod \"cinder-db-create-mx79w\" (UID: \"2def1143-79d4-48ec-bae4-f0b620c8bc73\") " pod="openstack/cinder-db-create-mx79w" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.000778 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edb21d30-fa05-490b-b1b1-cb0f73b8e149-operator-scripts\") pod \"cinder-0ef3-account-create-update-lqc7s\" (UID: \"edb21d30-fa05-490b-b1b1-cb0f73b8e149\") " pod="openstack/cinder-0ef3-account-create-update-lqc7s" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.001582 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edb21d30-fa05-490b-b1b1-cb0f73b8e149-operator-scripts\") pod \"cinder-0ef3-account-create-update-lqc7s\" (UID: \"edb21d30-fa05-490b-b1b1-cb0f73b8e149\") " pod="openstack/cinder-0ef3-account-create-update-lqc7s" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.001627 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2def1143-79d4-48ec-bae4-f0b620c8bc73-operator-scripts\") pod \"cinder-db-create-mx79w\" (UID: \"2def1143-79d4-48ec-bae4-f0b620c8bc73\") " pod="openstack/cinder-db-create-mx79w" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.001787 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mffp6\" (UniqueName: \"kubernetes.io/projected/37129b31-a43d-412f-878e-72729d852124-kube-api-access-mffp6\") pod \"barbican-db-create-pn5jk\" (UID: \"37129b31-a43d-412f-878e-72729d852124\") " pod="openstack/barbican-db-create-pn5jk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.018901 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvg8k\" (UniqueName: \"kubernetes.io/projected/2def1143-79d4-48ec-bae4-f0b620c8bc73-kube-api-access-qvg8k\") pod \"cinder-db-create-mx79w\" (UID: \"2def1143-79d4-48ec-bae4-f0b620c8bc73\") " pod="openstack/cinder-db-create-mx79w" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.019372 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "de1ed2b0-7aa0-43ad-a948-07e7b8de711f" (UID: 
"de1ed2b0-7aa0-43ad-a948-07e7b8de711f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.019447 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-kube-api-access-t4bgj" (OuterVolumeSpecName: "kube-api-access-t4bgj") pod "de1ed2b0-7aa0-43ad-a948-07e7b8de711f" (UID: "de1ed2b0-7aa0-43ad-a948-07e7b8de711f"). InnerVolumeSpecName "kube-api-access-t4bgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.019842 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpg2z\" (UniqueName: \"kubernetes.io/projected/edb21d30-fa05-490b-b1b1-cb0f73b8e149-kube-api-access-bpg2z\") pod \"cinder-0ef3-account-create-update-lqc7s\" (UID: \"edb21d30-fa05-490b-b1b1-cb0f73b8e149\") " pod="openstack/cinder-0ef3-account-create-update-lqc7s" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.052545 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.071106 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-e6c0-account-create-update-tr7lk"] Jan 05 14:08:19 crc kubenswrapper[4740]: E0105 14:08:19.071545 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1ed2b0-7aa0-43ad-a948-07e7b8de711f" containerName="glance-db-sync" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.071563 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1ed2b0-7aa0-43ad-a948-07e7b8de711f" containerName="glance-db-sync" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.071773 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="de1ed2b0-7aa0-43ad-a948-07e7b8de711f" containerName="glance-db-sync" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.072476 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-e6c0-account-create-update-tr7lk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.074355 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.075990 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.103803 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mffp6\" (UniqueName: \"kubernetes.io/projected/37129b31-a43d-412f-878e-72729d852124-kube-api-access-mffp6\") pod \"barbican-db-create-pn5jk\" (UID: \"37129b31-a43d-412f-878e-72729d852124\") " pod="openstack/barbican-db-create-pn5jk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.103926 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37129b31-a43d-412f-878e-72729d852124-operator-scripts\") pod \"barbican-db-create-pn5jk\" (UID: \"37129b31-a43d-412f-878e-72729d852124\") " pod="openstack/barbican-db-create-pn5jk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.103989 4740 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.104000 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4bgj\" (UniqueName: \"kubernetes.io/projected/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-kube-api-access-t4bgj\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.116132 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37129b31-a43d-412f-878e-72729d852124-operator-scripts\") pod \"barbican-db-create-pn5jk\" (UID: \"37129b31-a43d-412f-878e-72729d852124\") " pod="openstack/barbican-db-create-pn5jk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.123409 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de1ed2b0-7aa0-43ad-a948-07e7b8de711f" (UID: "de1ed2b0-7aa0-43ad-a948-07e7b8de711f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.127587 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-config-data" (OuterVolumeSpecName: "config-data") pod "de1ed2b0-7aa0-43ad-a948-07e7b8de711f" (UID: "de1ed2b0-7aa0-43ad-a948-07e7b8de711f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.128157 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mffp6\" (UniqueName: \"kubernetes.io/projected/37129b31-a43d-412f-878e-72729d852124-kube-api-access-mffp6\") pod \"barbican-db-create-pn5jk\" (UID: \"37129b31-a43d-412f-878e-72729d852124\") " pod="openstack/barbican-db-create-pn5jk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.135923 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-mljnv"] Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.137660 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-mljnv" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.149185 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-mljnv"] Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.177640 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-e6c0-account-create-update-tr7lk"] Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.178296 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mx79w" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.189610 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0ef3-account-create-update-lqc7s" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.219436 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac07b17f-06ac-42c1-8e1f-49f2101f4774-operator-scripts\") pod \"heat-db-create-mljnv\" (UID: \"ac07b17f-06ac-42c1-8e1f-49f2101f4774\") " pod="openstack/heat-db-create-mljnv" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.219916 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r8fh\" (UniqueName: \"kubernetes.io/projected/ac07b17f-06ac-42c1-8e1f-49f2101f4774-kube-api-access-7r8fh\") pod \"heat-db-create-mljnv\" (UID: \"ac07b17f-06ac-42c1-8e1f-49f2101f4774\") " pod="openstack/heat-db-create-mljnv" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.219995 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef9f4db3-55ef-4b94-8044-d317f3b1b760-operator-scripts\") pod \"heat-e6c0-account-create-update-tr7lk\" (UID: \"ef9f4db3-55ef-4b94-8044-d317f3b1b760\") " pod="openstack/heat-e6c0-account-create-update-tr7lk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.220415 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmr7w\" (UniqueName: \"kubernetes.io/projected/ef9f4db3-55ef-4b94-8044-d317f3b1b760-kube-api-access-mmr7w\") pod \"heat-e6c0-account-create-update-tr7lk\" (UID: \"ef9f4db3-55ef-4b94-8044-d317f3b1b760\") " pod="openstack/heat-e6c0-account-create-update-tr7lk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.220583 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.220602 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/de1ed2b0-7aa0-43ad-a948-07e7b8de711f-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.230471 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d0fc-account-create-update-tc228"] Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.231887 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d0fc-account-create-update-tc228" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.236104 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pn5jk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.236752 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.257474 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d0fc-account-create-update-tc228"] Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.272953 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-kzngh"] Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.276840 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kzngh" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.281451 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.281654 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.281806 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wktmq" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.287713 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.327769 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k49xv\" (UniqueName: \"kubernetes.io/projected/e383395f-47a3-4209-9cdb-102a606d9ce2-kube-api-access-k49xv\") pod \"barbican-d0fc-account-create-update-tc228\" (UID: \"e383395f-47a3-4209-9cdb-102a606d9ce2\") " pod="openstack/barbican-d0fc-account-create-update-tc228" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.327839 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e383395f-47a3-4209-9cdb-102a606d9ce2-operator-scripts\") pod \"barbican-d0fc-account-create-update-tc228\" (UID: \"e383395f-47a3-4209-9cdb-102a606d9ce2\") " pod="openstack/barbican-d0fc-account-create-update-tc228" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.327890 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r8fh\" (UniqueName: \"kubernetes.io/projected/ac07b17f-06ac-42c1-8e1f-49f2101f4774-kube-api-access-7r8fh\") pod \"heat-db-create-mljnv\" (UID: \"ac07b17f-06ac-42c1-8e1f-49f2101f4774\") " pod="openstack/heat-db-create-mljnv" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.327926 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef9f4db3-55ef-4b94-8044-d317f3b1b760-operator-scripts\") pod 
\"heat-e6c0-account-create-update-tr7lk\" (UID: \"ef9f4db3-55ef-4b94-8044-d317f3b1b760\") " pod="openstack/heat-e6c0-account-create-update-tr7lk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.328043 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmr7w\" (UniqueName: \"kubernetes.io/projected/ef9f4db3-55ef-4b94-8044-d317f3b1b760-kube-api-access-mmr7w\") pod \"heat-e6c0-account-create-update-tr7lk\" (UID: \"ef9f4db3-55ef-4b94-8044-d317f3b1b760\") " pod="openstack/heat-e6c0-account-create-update-tr7lk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.328210 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac07b17f-06ac-42c1-8e1f-49f2101f4774-operator-scripts\") pod \"heat-db-create-mljnv\" (UID: \"ac07b17f-06ac-42c1-8e1f-49f2101f4774\") " pod="openstack/heat-db-create-mljnv" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.329266 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac07b17f-06ac-42c1-8e1f-49f2101f4774-operator-scripts\") pod \"heat-db-create-mljnv\" (UID: \"ac07b17f-06ac-42c1-8e1f-49f2101f4774\") " pod="openstack/heat-db-create-mljnv" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.331078 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kzngh"] Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.339134 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef9f4db3-55ef-4b94-8044-d317f3b1b760-operator-scripts\") pod \"heat-e6c0-account-create-update-tr7lk\" (UID: \"ef9f4db3-55ef-4b94-8044-d317f3b1b760\") " pod="openstack/heat-e6c0-account-create-update-tr7lk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.406890 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-qmxnw" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.412015 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qmxnw" event={"ID":"de1ed2b0-7aa0-43ad-a948-07e7b8de711f","Type":"ContainerDied","Data":"af739fb54670fc8456ff40f347b31e5386f096758205d0c134c49b1db3dfa19d"} Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.412075 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af739fb54670fc8456ff40f347b31e5386f096758205d0c134c49b1db3dfa19d" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.420863 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r8fh\" (UniqueName: \"kubernetes.io/projected/ac07b17f-06ac-42c1-8e1f-49f2101f4774-kube-api-access-7r8fh\") pod \"heat-db-create-mljnv\" (UID: \"ac07b17f-06ac-42c1-8e1f-49f2101f4774\") " pod="openstack/heat-db-create-mljnv" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.424645 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.454773 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmr7w\" (UniqueName: \"kubernetes.io/projected/ef9f4db3-55ef-4b94-8044-d317f3b1b760-kube-api-access-mmr7w\") pod \"heat-e6c0-account-create-update-tr7lk\" (UID: \"ef9f4db3-55ef-4b94-8044-d317f3b1b760\") " pod="openstack/heat-e6c0-account-create-update-tr7lk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.548155 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-combined-ca-bundle\") pod \"keystone-db-sync-kzngh\" (UID: \"6aa5ad76-5e26-4c24-8b3a-50ff2a182523\") " pod="openstack/keystone-db-sync-kzngh" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.568261 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6svvs\" (UniqueName: \"kubernetes.io/projected/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-kube-api-access-6svvs\") pod \"keystone-db-sync-kzngh\" (UID: \"6aa5ad76-5e26-4c24-8b3a-50ff2a182523\") " pod="openstack/keystone-db-sync-kzngh" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.568376 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-config-data\") pod \"keystone-db-sync-kzngh\" (UID: \"6aa5ad76-5e26-4c24-8b3a-50ff2a182523\") " pod="openstack/keystone-db-sync-kzngh" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.587626 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k49xv\" (UniqueName: \"kubernetes.io/projected/e383395f-47a3-4209-9cdb-102a606d9ce2-kube-api-access-k49xv\") pod \"barbican-d0fc-account-create-update-tc228\" (UID: \"e383395f-47a3-4209-9cdb-102a606d9ce2\") " pod="openstack/barbican-d0fc-account-create-update-tc228" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.587690 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e383395f-47a3-4209-9cdb-102a606d9ce2-operator-scripts\") pod \"barbican-d0fc-account-create-update-tc228\" (UID: \"e383395f-47a3-4209-9cdb-102a606d9ce2\") " 
pod="openstack/barbican-d0fc-account-create-update-tc228" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.588453 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e383395f-47a3-4209-9cdb-102a606d9ce2-operator-scripts\") pod \"barbican-d0fc-account-create-update-tc228\" (UID: \"e383395f-47a3-4209-9cdb-102a606d9ce2\") " pod="openstack/barbican-d0fc-account-create-update-tc228" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.598355 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-9k7k2"] Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.599929 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9k7k2" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.630148 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9k7k2"] Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.633397 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k49xv\" (UniqueName: \"kubernetes.io/projected/e383395f-47a3-4209-9cdb-102a606d9ce2-kube-api-access-k49xv\") pod \"barbican-d0fc-account-create-update-tc228\" (UID: \"e383395f-47a3-4209-9cdb-102a606d9ce2\") " pod="openstack/barbican-d0fc-account-create-update-tc228" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.652625 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-0cc4-account-create-update-k9mmk"] Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.654210 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0cc4-account-create-update-k9mmk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.656322 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.690496 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-combined-ca-bundle\") pod \"keystone-db-sync-kzngh\" (UID: \"6aa5ad76-5e26-4c24-8b3a-50ff2a182523\") " pod="openstack/keystone-db-sync-kzngh" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.690614 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6svvs\" (UniqueName: \"kubernetes.io/projected/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-kube-api-access-6svvs\") pod \"keystone-db-sync-kzngh\" (UID: \"6aa5ad76-5e26-4c24-8b3a-50ff2a182523\") " pod="openstack/keystone-db-sync-kzngh" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.690683 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-config-data\") pod \"keystone-db-sync-kzngh\" (UID: \"6aa5ad76-5e26-4c24-8b3a-50ff2a182523\") " pod="openstack/keystone-db-sync-kzngh" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.709027 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-combined-ca-bundle\") pod \"keystone-db-sync-kzngh\" (UID: \"6aa5ad76-5e26-4c24-8b3a-50ff2a182523\") " pod="openstack/keystone-db-sync-kzngh" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.717607 4740 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-6svvs\" (UniqueName: \"kubernetes.io/projected/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-kube-api-access-6svvs\") pod \"keystone-db-sync-kzngh\" (UID: \"6aa5ad76-5e26-4c24-8b3a-50ff2a182523\") " pod="openstack/keystone-db-sync-kzngh" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.718336 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-config-data\") pod \"keystone-db-sync-kzngh\" (UID: \"6aa5ad76-5e26-4c24-8b3a-50ff2a182523\") " pod="openstack/keystone-db-sync-kzngh" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.728596 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0cc4-account-create-update-k9mmk"] Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.791515 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-e6c0-account-create-update-tr7lk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.800487 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90dd4a82-d074-47a2-a853-d7376f242c35-operator-scripts\") pod \"neutron-db-create-9k7k2\" (UID: \"90dd4a82-d074-47a2-a853-d7376f242c35\") " pod="openstack/neutron-db-create-9k7k2" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.800562 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfpkp\" (UniqueName: \"kubernetes.io/projected/90dd4a82-d074-47a2-a853-d7376f242c35-kube-api-access-pfpkp\") pod \"neutron-db-create-9k7k2\" (UID: \"90dd4a82-d074-47a2-a853-d7376f242c35\") " pod="openstack/neutron-db-create-9k7k2" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.800599 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7mr8\" (UniqueName: \"kubernetes.io/projected/a1785e28-d1f5-4985-b080-4496c9b43c6e-kube-api-access-m7mr8\") pod \"neutron-0cc4-account-create-update-k9mmk\" (UID: \"a1785e28-d1f5-4985-b080-4496c9b43c6e\") " pod="openstack/neutron-0cc4-account-create-update-k9mmk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.800696 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1785e28-d1f5-4985-b080-4496c9b43c6e-operator-scripts\") pod \"neutron-0cc4-account-create-update-k9mmk\" (UID: \"a1785e28-d1f5-4985-b080-4496c9b43c6e\") " pod="openstack/neutron-0cc4-account-create-update-k9mmk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.806729 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-mljnv" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.829809 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-xz2j4"] Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.861525 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d0fc-account-create-update-tc228" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.902364 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1785e28-d1f5-4985-b080-4496c9b43c6e-operator-scripts\") pod \"neutron-0cc4-account-create-update-k9mmk\" (UID: \"a1785e28-d1f5-4985-b080-4496c9b43c6e\") " pod="openstack/neutron-0cc4-account-create-update-k9mmk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.902517 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90dd4a82-d074-47a2-a853-d7376f242c35-operator-scripts\") pod \"neutron-db-create-9k7k2\" (UID: \"90dd4a82-d074-47a2-a853-d7376f242c35\") " pod="openstack/neutron-db-create-9k7k2" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.902558 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfpkp\" (UniqueName: \"kubernetes.io/projected/90dd4a82-d074-47a2-a853-d7376f242c35-kube-api-access-pfpkp\") pod \"neutron-db-create-9k7k2\" (UID: \"90dd4a82-d074-47a2-a853-d7376f242c35\") " pod="openstack/neutron-db-create-9k7k2" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.902585 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7mr8\" (UniqueName: \"kubernetes.io/projected/a1785e28-d1f5-4985-b080-4496c9b43c6e-kube-api-access-m7mr8\") pod \"neutron-0cc4-account-create-update-k9mmk\" (UID: \"a1785e28-d1f5-4985-b080-4496c9b43c6e\") " pod="openstack/neutron-0cc4-account-create-update-k9mmk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.903289 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1785e28-d1f5-4985-b080-4496c9b43c6e-operator-scripts\") pod \"neutron-0cc4-account-create-update-k9mmk\" (UID: \"a1785e28-d1f5-4985-b080-4496c9b43c6e\") " pod="openstack/neutron-0cc4-account-create-update-k9mmk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.908425 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90dd4a82-d074-47a2-a853-d7376f242c35-operator-scripts\") pod \"neutron-db-create-9k7k2\" (UID: \"90dd4a82-d074-47a2-a853-d7376f242c35\") " pod="openstack/neutron-db-create-9k7k2" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.910659 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vj867"] Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.912654 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.914717 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kzngh" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.921811 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7mr8\" (UniqueName: \"kubernetes.io/projected/a1785e28-d1f5-4985-b080-4496c9b43c6e-kube-api-access-m7mr8\") pod \"neutron-0cc4-account-create-update-k9mmk\" (UID: \"a1785e28-d1f5-4985-b080-4496c9b43c6e\") " pod="openstack/neutron-0cc4-account-create-update-k9mmk" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.946365 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vj867"] Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.952887 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfpkp\" (UniqueName: \"kubernetes.io/projected/90dd4a82-d074-47a2-a853-d7376f242c35-kube-api-access-pfpkp\") pod \"neutron-db-create-9k7k2\" (UID: \"90dd4a82-d074-47a2-a853-d7376f242c35\") " pod="openstack/neutron-db-create-9k7k2" Jan 05 14:08:19 crc kubenswrapper[4740]: I0105 14:08:19.998205 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0cc4-account-create-update-k9mmk" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.006279 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.006384 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.006419 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.006546 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w8bn\" (UniqueName: \"kubernetes.io/projected/9e329289-66fc-459b-8360-f0e8df0edcc9-kube-api-access-6w8bn\") pod \"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.006574 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-config\") pod \"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.006625 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.016103 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0ef3-account-create-update-lqc7s"] Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.108239 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.108536 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.108592 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.108617 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.108688 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w8bn\" (UniqueName: \"kubernetes.io/projected/9e329289-66fc-459b-8360-f0e8df0edcc9-kube-api-access-6w8bn\") pod \"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.108712 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-config\") pod \"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.109037 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.109412 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.109460 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-config\") pod \"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.109612 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.109955 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.127790 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w8bn\" (UniqueName: \"kubernetes.io/projected/9e329289-66fc-459b-8360-f0e8df0edcc9-kube-api-access-6w8bn\") pod \"dnsmasq-dns-5f59b8f679-vj867\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.219188 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mx79w"] Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.232231 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9k7k2" Jan 05 14:08:20 crc kubenswrapper[4740]: W0105 14:08:20.235304 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2def1143_79d4_48ec_bae4_f0b620c8bc73.slice/crio-baddb997008eb4df652e07fde333a5ff07030255d5a5dad5291b11ec2ad9cd30 WatchSource:0}: Error finding container baddb997008eb4df652e07fde333a5ff07030255d5a5dad5291b11ec2ad9cd30: Status 404 returned error can't find the container with id baddb997008eb4df652e07fde333a5ff07030255d5a5dad5291b11ec2ad9cd30 Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.239603 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-pn5jk"] Jan 05 14:08:20 crc kubenswrapper[4740]: W0105 14:08:20.242229 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37129b31_a43d_412f_878e_72729d852124.slice/crio-809f5b9c2a0a56776ad23e9d76a81b57491745f16285bbc694e329c4f3a60555 WatchSource:0}: Error finding container 809f5b9c2a0a56776ad23e9d76a81b57491745f16285bbc694e329c4f3a60555: Status 404 returned error can't find the container with id 809f5b9c2a0a56776ad23e9d76a81b57491745f16285bbc694e329c4f3a60555 Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.408398 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.430078 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0ef3-account-create-update-lqc7s" event={"ID":"edb21d30-fa05-490b-b1b1-cb0f73b8e149","Type":"ContainerStarted","Data":"37744d4eb955ba3023bc5b03bde511006dfd26d59ec64e9b2ac4a8e600079d53"} Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.432157 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mx79w" event={"ID":"2def1143-79d4-48ec-bae4-f0b620c8bc73","Type":"ContainerStarted","Data":"baddb997008eb4df652e07fde333a5ff07030255d5a5dad5291b11ec2ad9cd30"} Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.435138 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pn5jk" event={"ID":"37129b31-a43d-412f-878e-72729d852124","Type":"ContainerStarted","Data":"809f5b9c2a0a56776ad23e9d76a81b57491745f16285bbc694e329c4f3a60555"} Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.435395 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" podUID="b3bfc133-670f-4697-b3f3-f7ae2fdd52ae" containerName="dnsmasq-dns" containerID="cri-o://ccea1ab020056224f676a29334c7607f18bae13eb7a08cab74f8341db1285de2" gracePeriod=10 Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.814805 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kzngh"] Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.836026 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0cc4-account-create-update-k9mmk"] Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.844837 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-mljnv"] Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.857502 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-e6c0-account-create-update-tr7lk"] Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.866990 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d0fc-account-create-update-tc228"] Jan 05 14:08:20 crc kubenswrapper[4740]: W0105 14:08:20.874240 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac07b17f_06ac_42c1_8e1f_49f2101f4774.slice/crio-ffbc302e65050485098b6df4badb942a64925752e953fdfeab6f66a4f683468a WatchSource:0}: Error finding container ffbc302e65050485098b6df4badb942a64925752e953fdfeab6f66a4f683468a: Status 404 returned error can't find the container with id ffbc302e65050485098b6df4badb942a64925752e953fdfeab6f66a4f683468a Jan 05 14:08:20 crc kubenswrapper[4740]: W0105 14:08:20.887349 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef9f4db3_55ef_4b94_8044_d317f3b1b760.slice/crio-2a3b432078db95eda3d799e7f8cd5e7c25d8466db60a93e248c126cdeec46bb7 WatchSource:0}: Error finding container 2a3b432078db95eda3d799e7f8cd5e7c25d8466db60a93e248c126cdeec46bb7: Status 404 returned error can't find the container with id 2a3b432078db95eda3d799e7f8cd5e7c25d8466db60a93e248c126cdeec46bb7 Jan 05 14:08:20 crc kubenswrapper[4740]: W0105 14:08:20.887886 4740 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode383395f_47a3_4209_9cdb_102a606d9ce2.slice/crio-b585472a5a6f0abd9835d82ac668288cc90c0933eef962443eb68205cf8e01f2 WatchSource:0}: Error finding container b585472a5a6f0abd9835d82ac668288cc90c0933eef962443eb68205cf8e01f2: Status 404 returned error can't find the container with id b585472a5a6f0abd9835d82ac668288cc90c0933eef962443eb68205cf8e01f2 Jan 05 14:08:20 crc kubenswrapper[4740]: I0105 14:08:20.961705 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9k7k2"] Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.241806 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vj867"] Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.361550 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.465551 4740 generic.go:334] "Generic (PLEG): container finished" podID="2def1143-79d4-48ec-bae4-f0b620c8bc73" containerID="fb72cdd97defef4e104c59d230ecd53fcc97cff5342fd67e0da5c4fa2b298dec" exitCode=0 Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.465613 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mx79w" event={"ID":"2def1143-79d4-48ec-bae4-f0b620c8bc73","Type":"ContainerDied","Data":"fb72cdd97defef4e104c59d230ecd53fcc97cff5342fd67e0da5c4fa2b298dec"} Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.469012 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-e6c0-account-create-update-tr7lk" event={"ID":"ef9f4db3-55ef-4b94-8044-d317f3b1b760","Type":"ContainerStarted","Data":"2a3b432078db95eda3d799e7f8cd5e7c25d8466db60a93e248c126cdeec46bb7"} Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.470958 4740 generic.go:334] "Generic (PLEG): container finished" podID="37129b31-a43d-412f-878e-72729d852124" containerID="1dfbcabc3dd9c2cf3091de5da1f5cb9da5db8b271d62adc2f1687d43ee91b281" exitCode=0 Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.471087 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pn5jk" event={"ID":"37129b31-a43d-412f-878e-72729d852124","Type":"ContainerDied","Data":"1dfbcabc3dd9c2cf3091de5da1f5cb9da5db8b271d62adc2f1687d43ee91b281"} Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.473013 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" event={"ID":"9e329289-66fc-459b-8360-f0e8df0edcc9","Type":"ContainerStarted","Data":"72985004c9d03c1c6ff727665bb1b0f6496c634b8db8d360b2c3fabca93429e4"} Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.474059 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9k7k2" event={"ID":"90dd4a82-d074-47a2-a853-d7376f242c35","Type":"ContainerStarted","Data":"accb98a9203a7f05ce2e79609bfe93cb10cff9b620b78cdf6dfdde3d68281558"} Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.475754 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-mljnv" event={"ID":"ac07b17f-06ac-42c1-8e1f-49f2101f4774","Type":"ContainerStarted","Data":"ffbc302e65050485098b6df4badb942a64925752e953fdfeab6f66a4f683468a"} Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.476831 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0cc4-account-create-update-k9mmk" 
event={"ID":"a1785e28-d1f5-4985-b080-4496c9b43c6e","Type":"ContainerStarted","Data":"faf96eff4fa361af96c721f2486b7b6285d8230f5fc9f643c0b4e2737e8a1695"} Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.479018 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d0fc-account-create-update-tc228" event={"ID":"e383395f-47a3-4209-9cdb-102a606d9ce2","Type":"ContainerStarted","Data":"b585472a5a6f0abd9835d82ac668288cc90c0933eef962443eb68205cf8e01f2"} Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.481297 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kzngh" event={"ID":"6aa5ad76-5e26-4c24-8b3a-50ff2a182523","Type":"ContainerStarted","Data":"d28624c9885e4ff39590eb796572699250a5392af767796d977f339815ab98d5"} Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.483302 4740 generic.go:334] "Generic (PLEG): container finished" podID="b3bfc133-670f-4697-b3f3-f7ae2fdd52ae" containerID="ccea1ab020056224f676a29334c7607f18bae13eb7a08cab74f8341db1285de2" exitCode=0 Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.483357 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" event={"ID":"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae","Type":"ContainerDied","Data":"ccea1ab020056224f676a29334c7607f18bae13eb7a08cab74f8341db1285de2"} Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.483375 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" event={"ID":"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae","Type":"ContainerDied","Data":"24f9b95ea1000e68becc7e11443d9c4d0afe1f16c0a08096f963a4cea4cc0b98"} Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.483392 4740 scope.go:117] "RemoveContainer" containerID="ccea1ab020056224f676a29334c7607f18bae13eb7a08cab74f8341db1285de2" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.483419 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-xz2j4" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.487282 4740 generic.go:334] "Generic (PLEG): container finished" podID="edb21d30-fa05-490b-b1b1-cb0f73b8e149" containerID="6782d68fdddbaa23a295d893e21ceacc93684ba92d9b7cb6d02530071bffa21e" exitCode=0 Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.487328 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0ef3-account-create-update-lqc7s" event={"ID":"edb21d30-fa05-490b-b1b1-cb0f73b8e149","Type":"ContainerDied","Data":"6782d68fdddbaa23a295d893e21ceacc93684ba92d9b7cb6d02530071bffa21e"} Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.511126 4740 scope.go:117] "RemoveContainer" containerID="f91309fe563029c5190fe05d9b6a845f83d284c58dcd42796b8018e1004d6a44" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.542171 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66rwk\" (UniqueName: \"kubernetes.io/projected/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-kube-api-access-66rwk\") pod \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.542469 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-ovsdbserver-nb\") pod \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.542552 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-dns-svc\") pod \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.542649 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-dns-swift-storage-0\") pod \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.542718 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-config\") pod \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.542755 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-ovsdbserver-sb\") pod \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\" (UID: \"b3bfc133-670f-4697-b3f3-f7ae2fdd52ae\") " Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.543249 4740 scope.go:117] "RemoveContainer" containerID="ccea1ab020056224f676a29334c7607f18bae13eb7a08cab74f8341db1285de2" Jan 05 14:08:21 crc kubenswrapper[4740]: E0105 14:08:21.544124 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccea1ab020056224f676a29334c7607f18bae13eb7a08cab74f8341db1285de2\": container with ID starting with ccea1ab020056224f676a29334c7607f18bae13eb7a08cab74f8341db1285de2 not found: ID does not exist" 
containerID="ccea1ab020056224f676a29334c7607f18bae13eb7a08cab74f8341db1285de2" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.544182 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccea1ab020056224f676a29334c7607f18bae13eb7a08cab74f8341db1285de2"} err="failed to get container status \"ccea1ab020056224f676a29334c7607f18bae13eb7a08cab74f8341db1285de2\": rpc error: code = NotFound desc = could not find container \"ccea1ab020056224f676a29334c7607f18bae13eb7a08cab74f8341db1285de2\": container with ID starting with ccea1ab020056224f676a29334c7607f18bae13eb7a08cab74f8341db1285de2 not found: ID does not exist" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.544241 4740 scope.go:117] "RemoveContainer" containerID="f91309fe563029c5190fe05d9b6a845f83d284c58dcd42796b8018e1004d6a44" Jan 05 14:08:21 crc kubenswrapper[4740]: E0105 14:08:21.544865 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f91309fe563029c5190fe05d9b6a845f83d284c58dcd42796b8018e1004d6a44\": container with ID starting with f91309fe563029c5190fe05d9b6a845f83d284c58dcd42796b8018e1004d6a44 not found: ID does not exist" containerID="f91309fe563029c5190fe05d9b6a845f83d284c58dcd42796b8018e1004d6a44" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.544978 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f91309fe563029c5190fe05d9b6a845f83d284c58dcd42796b8018e1004d6a44"} err="failed to get container status \"f91309fe563029c5190fe05d9b6a845f83d284c58dcd42796b8018e1004d6a44\": rpc error: code = NotFound desc = could not find container \"f91309fe563029c5190fe05d9b6a845f83d284c58dcd42796b8018e1004d6a44\": container with ID starting with f91309fe563029c5190fe05d9b6a845f83d284c58dcd42796b8018e1004d6a44 not found: ID does not exist" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.548582 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-kube-api-access-66rwk" (OuterVolumeSpecName: "kube-api-access-66rwk") pod "b3bfc133-670f-4697-b3f3-f7ae2fdd52ae" (UID: "b3bfc133-670f-4697-b3f3-f7ae2fdd52ae"). InnerVolumeSpecName "kube-api-access-66rwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.648704 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66rwk\" (UniqueName: \"kubernetes.io/projected/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-kube-api-access-66rwk\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.712905 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-config" (OuterVolumeSpecName: "config") pod "b3bfc133-670f-4697-b3f3-f7ae2fdd52ae" (UID: "b3bfc133-670f-4697-b3f3-f7ae2fdd52ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.747301 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b3bfc133-670f-4697-b3f3-f7ae2fdd52ae" (UID: "b3bfc133-670f-4697-b3f3-f7ae2fdd52ae"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.750802 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.750829 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.767556 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b3bfc133-670f-4697-b3f3-f7ae2fdd52ae" (UID: "b3bfc133-670f-4697-b3f3-f7ae2fdd52ae"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.774965 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b3bfc133-670f-4697-b3f3-f7ae2fdd52ae" (UID: "b3bfc133-670f-4697-b3f3-f7ae2fdd52ae"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.801570 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b3bfc133-670f-4697-b3f3-f7ae2fdd52ae" (UID: "b3bfc133-670f-4697-b3f3-f7ae2fdd52ae"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.852730 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.852766 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:21 crc kubenswrapper[4740]: I0105 14:08:21.852776 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:22 crc kubenswrapper[4740]: I0105 14:08:22.399976 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-xz2j4"] Jan 05 14:08:22 crc kubenswrapper[4740]: I0105 14:08:22.410373 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-xz2j4"] Jan 05 14:08:22 crc kubenswrapper[4740]: I0105 14:08:22.498901 4740 generic.go:334] "Generic (PLEG): container finished" podID="90dd4a82-d074-47a2-a853-d7376f242c35" containerID="dc8565b5a104352f6fa87bc382b60ad7d5eced67a01b6fd19123cde70af1f693" exitCode=0 Jan 05 14:08:22 crc kubenswrapper[4740]: I0105 14:08:22.499092 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9k7k2" event={"ID":"90dd4a82-d074-47a2-a853-d7376f242c35","Type":"ContainerDied","Data":"dc8565b5a104352f6fa87bc382b60ad7d5eced67a01b6fd19123cde70af1f693"} Jan 05 14:08:22 crc kubenswrapper[4740]: I0105 14:08:22.503806 4740 generic.go:334] "Generic (PLEG): container finished" podID="ac07b17f-06ac-42c1-8e1f-49f2101f4774" containerID="4807b7354a9fe83590199e8beee3abe136c43f6542e227fc14701977ca03c052" exitCode=0 Jan 05 14:08:22 crc kubenswrapper[4740]: I0105 14:08:22.503880 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-mljnv" event={"ID":"ac07b17f-06ac-42c1-8e1f-49f2101f4774","Type":"ContainerDied","Data":"4807b7354a9fe83590199e8beee3abe136c43f6542e227fc14701977ca03c052"} Jan 05 14:08:22 crc kubenswrapper[4740]: I0105 14:08:22.506422 4740 generic.go:334] "Generic (PLEG): container finished" podID="a1785e28-d1f5-4985-b080-4496c9b43c6e" containerID="1a3aaa76db1c68f48c71ebab38598b5d130c3cfd6ea367b37e2a17108f3d4720" exitCode=0 Jan 05 14:08:22 crc kubenswrapper[4740]: I0105 14:08:22.506553 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0cc4-account-create-update-k9mmk" event={"ID":"a1785e28-d1f5-4985-b080-4496c9b43c6e","Type":"ContainerDied","Data":"1a3aaa76db1c68f48c71ebab38598b5d130c3cfd6ea367b37e2a17108f3d4720"} Jan 05 14:08:22 crc kubenswrapper[4740]: I0105 14:08:22.521253 4740 generic.go:334] "Generic (PLEG): container finished" podID="e383395f-47a3-4209-9cdb-102a606d9ce2" containerID="344cc711f70384fa901eaf2e97f65aef64f354583b12adf8a71530c7ac23aee1" exitCode=0 Jan 05 14:08:22 crc kubenswrapper[4740]: I0105 14:08:22.521399 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d0fc-account-create-update-tc228" event={"ID":"e383395f-47a3-4209-9cdb-102a606d9ce2","Type":"ContainerDied","Data":"344cc711f70384fa901eaf2e97f65aef64f354583b12adf8a71530c7ac23aee1"} Jan 05 14:08:22 crc kubenswrapper[4740]: I0105 14:08:22.523846 4740 generic.go:334] 
"Generic (PLEG): container finished" podID="ef9f4db3-55ef-4b94-8044-d317f3b1b760" containerID="dabb4dba51eb13ff6f67a3f332098c2fe64c2d6f6b22d003a9c547e7d9f0f202" exitCode=0 Jan 05 14:08:22 crc kubenswrapper[4740]: I0105 14:08:22.523917 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-e6c0-account-create-update-tr7lk" event={"ID":"ef9f4db3-55ef-4b94-8044-d317f3b1b760","Type":"ContainerDied","Data":"dabb4dba51eb13ff6f67a3f332098c2fe64c2d6f6b22d003a9c547e7d9f0f202"} Jan 05 14:08:22 crc kubenswrapper[4740]: I0105 14:08:22.526372 4740 generic.go:334] "Generic (PLEG): container finished" podID="9e329289-66fc-459b-8360-f0e8df0edcc9" containerID="7045ef52d865795eb8d0b571baed744f205f5772479fa437d50c95fd3e4e3e89" exitCode=0 Jan 05 14:08:22 crc kubenswrapper[4740]: I0105 14:08:22.527570 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" event={"ID":"9e329289-66fc-459b-8360-f0e8df0edcc9","Type":"ContainerDied","Data":"7045ef52d865795eb8d0b571baed744f205f5772479fa437d50c95fd3e4e3e89"} Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:22.998723 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3bfc133-670f-4697-b3f3-f7ae2fdd52ae" path="/var/lib/kubelet/pods/b3bfc133-670f-4697-b3f3-f7ae2fdd52ae/volumes" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.119777 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0ef3-account-create-update-lqc7s" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.305170 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edb21d30-fa05-490b-b1b1-cb0f73b8e149-operator-scripts\") pod \"edb21d30-fa05-490b-b1b1-cb0f73b8e149\" (UID: \"edb21d30-fa05-490b-b1b1-cb0f73b8e149\") " Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.305344 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpg2z\" (UniqueName: \"kubernetes.io/projected/edb21d30-fa05-490b-b1b1-cb0f73b8e149-kube-api-access-bpg2z\") pod \"edb21d30-fa05-490b-b1b1-cb0f73b8e149\" (UID: \"edb21d30-fa05-490b-b1b1-cb0f73b8e149\") " Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.306610 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edb21d30-fa05-490b-b1b1-cb0f73b8e149-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "edb21d30-fa05-490b-b1b1-cb0f73b8e149" (UID: "edb21d30-fa05-490b-b1b1-cb0f73b8e149"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.312591 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edb21d30-fa05-490b-b1b1-cb0f73b8e149-kube-api-access-bpg2z" (OuterVolumeSpecName: "kube-api-access-bpg2z") pod "edb21d30-fa05-490b-b1b1-cb0f73b8e149" (UID: "edb21d30-fa05-490b-b1b1-cb0f73b8e149"). InnerVolumeSpecName "kube-api-access-bpg2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.394856 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pn5jk" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.403192 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-mx79w" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.411645 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edb21d30-fa05-490b-b1b1-cb0f73b8e149-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.411671 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpg2z\" (UniqueName: \"kubernetes.io/projected/edb21d30-fa05-490b-b1b1-cb0f73b8e149-kube-api-access-bpg2z\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.513254 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mffp6\" (UniqueName: \"kubernetes.io/projected/37129b31-a43d-412f-878e-72729d852124-kube-api-access-mffp6\") pod \"37129b31-a43d-412f-878e-72729d852124\" (UID: \"37129b31-a43d-412f-878e-72729d852124\") " Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.513302 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvg8k\" (UniqueName: \"kubernetes.io/projected/2def1143-79d4-48ec-bae4-f0b620c8bc73-kube-api-access-qvg8k\") pod \"2def1143-79d4-48ec-bae4-f0b620c8bc73\" (UID: \"2def1143-79d4-48ec-bae4-f0b620c8bc73\") " Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.513574 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37129b31-a43d-412f-878e-72729d852124-operator-scripts\") pod \"37129b31-a43d-412f-878e-72729d852124\" (UID: \"37129b31-a43d-412f-878e-72729d852124\") " Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.513615 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2def1143-79d4-48ec-bae4-f0b620c8bc73-operator-scripts\") pod \"2def1143-79d4-48ec-bae4-f0b620c8bc73\" (UID: \"2def1143-79d4-48ec-bae4-f0b620c8bc73\") " Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.514129 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37129b31-a43d-412f-878e-72729d852124-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37129b31-a43d-412f-878e-72729d852124" (UID: "37129b31-a43d-412f-878e-72729d852124"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.515298 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2def1143-79d4-48ec-bae4-f0b620c8bc73-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2def1143-79d4-48ec-bae4-f0b620c8bc73" (UID: "2def1143-79d4-48ec-bae4-f0b620c8bc73"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.517987 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37129b31-a43d-412f-878e-72729d852124-kube-api-access-mffp6" (OuterVolumeSpecName: "kube-api-access-mffp6") pod "37129b31-a43d-412f-878e-72729d852124" (UID: "37129b31-a43d-412f-878e-72729d852124"). InnerVolumeSpecName "kube-api-access-mffp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.518696 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2def1143-79d4-48ec-bae4-f0b620c8bc73-kube-api-access-qvg8k" (OuterVolumeSpecName: "kube-api-access-qvg8k") pod "2def1143-79d4-48ec-bae4-f0b620c8bc73" (UID: "2def1143-79d4-48ec-bae4-f0b620c8bc73"). InnerVolumeSpecName "kube-api-access-qvg8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.546221 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pn5jk" event={"ID":"37129b31-a43d-412f-878e-72729d852124","Type":"ContainerDied","Data":"809f5b9c2a0a56776ad23e9d76a81b57491745f16285bbc694e329c4f3a60555"} Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.546301 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="809f5b9c2a0a56776ad23e9d76a81b57491745f16285bbc694e329c4f3a60555" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.546321 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pn5jk" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.548521 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" event={"ID":"9e329289-66fc-459b-8360-f0e8df0edcc9","Type":"ContainerStarted","Data":"edf6bcae04123e30ad5daccfc24ec9e527589649433100b25317cc7e703fe4ea"} Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.549670 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.553006 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0ef3-account-create-update-lqc7s" event={"ID":"edb21d30-fa05-490b-b1b1-cb0f73b8e149","Type":"ContainerDied","Data":"37744d4eb955ba3023bc5b03bde511006dfd26d59ec64e9b2ac4a8e600079d53"} Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.553032 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0ef3-account-create-update-lqc7s" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.553079 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37744d4eb955ba3023bc5b03bde511006dfd26d59ec64e9b2ac4a8e600079d53" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.559057 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-mx79w" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.559555 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mx79w" event={"ID":"2def1143-79d4-48ec-bae4-f0b620c8bc73","Type":"ContainerDied","Data":"baddb997008eb4df652e07fde333a5ff07030255d5a5dad5291b11ec2ad9cd30"} Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.559613 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baddb997008eb4df652e07fde333a5ff07030255d5a5dad5291b11ec2ad9cd30" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.569808 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" podStartSLOduration=4.569792991 podStartE2EDuration="4.569792991s" podCreationTimestamp="2026-01-05 14:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:08:23.564336595 +0000 UTC m=+1152.871245184" watchObservedRunningTime="2026-01-05 14:08:23.569792991 +0000 UTC m=+1152.876701570" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.616977 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37129b31-a43d-412f-878e-72729d852124-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.617018 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2def1143-79d4-48ec-bae4-f0b620c8bc73-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.617027 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mffp6\" (UniqueName: \"kubernetes.io/projected/37129b31-a43d-412f-878e-72729d852124-kube-api-access-mffp6\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:23 crc kubenswrapper[4740]: I0105 14:08:23.617039 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvg8k\" (UniqueName: \"kubernetes.io/projected/2def1143-79d4-48ec-bae4-f0b620c8bc73-kube-api-access-qvg8k\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.487656 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-mljnv" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.558686 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0cc4-account-create-update-k9mmk" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.568891 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-e6c0-account-create-update-tr7lk" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.589736 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r8fh\" (UniqueName: \"kubernetes.io/projected/ac07b17f-06ac-42c1-8e1f-49f2101f4774-kube-api-access-7r8fh\") pod \"ac07b17f-06ac-42c1-8e1f-49f2101f4774\" (UID: \"ac07b17f-06ac-42c1-8e1f-49f2101f4774\") " Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.589869 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac07b17f-06ac-42c1-8e1f-49f2101f4774-operator-scripts\") pod \"ac07b17f-06ac-42c1-8e1f-49f2101f4774\" (UID: \"ac07b17f-06ac-42c1-8e1f-49f2101f4774\") " Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.594447 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9k7k2" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.597864 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac07b17f-06ac-42c1-8e1f-49f2101f4774-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac07b17f-06ac-42c1-8e1f-49f2101f4774" (UID: "ac07b17f-06ac-42c1-8e1f-49f2101f4774"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.609136 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d0fc-account-create-update-tc228" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.609621 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9k7k2" event={"ID":"90dd4a82-d074-47a2-a853-d7376f242c35","Type":"ContainerDied","Data":"accb98a9203a7f05ce2e79609bfe93cb10cff9b620b78cdf6dfdde3d68281558"} Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.609662 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="accb98a9203a7f05ce2e79609bfe93cb10cff9b620b78cdf6dfdde3d68281558" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.609617 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9k7k2" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.627147 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac07b17f-06ac-42c1-8e1f-49f2101f4774-kube-api-access-7r8fh" (OuterVolumeSpecName: "kube-api-access-7r8fh") pod "ac07b17f-06ac-42c1-8e1f-49f2101f4774" (UID: "ac07b17f-06ac-42c1-8e1f-49f2101f4774"). InnerVolumeSpecName "kube-api-access-7r8fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.639002 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-mljnv" event={"ID":"ac07b17f-06ac-42c1-8e1f-49f2101f4774","Type":"ContainerDied","Data":"ffbc302e65050485098b6df4badb942a64925752e953fdfeab6f66a4f683468a"} Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.639057 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffbc302e65050485098b6df4badb942a64925752e953fdfeab6f66a4f683468a" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.639129 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-mljnv" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.650510 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0cc4-account-create-update-k9mmk" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.650521 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0cc4-account-create-update-k9mmk" event={"ID":"a1785e28-d1f5-4985-b080-4496c9b43c6e","Type":"ContainerDied","Data":"faf96eff4fa361af96c721f2486b7b6285d8230f5fc9f643c0b4e2737e8a1695"} Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.650576 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faf96eff4fa361af96c721f2486b7b6285d8230f5fc9f643c0b4e2737e8a1695" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.653132 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d0fc-account-create-update-tc228" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.653135 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d0fc-account-create-update-tc228" event={"ID":"e383395f-47a3-4209-9cdb-102a606d9ce2","Type":"ContainerDied","Data":"b585472a5a6f0abd9835d82ac668288cc90c0933eef962443eb68205cf8e01f2"} Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.653180 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b585472a5a6f0abd9835d82ac668288cc90c0933eef962443eb68205cf8e01f2" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.656717 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-e6c0-account-create-update-tr7lk" event={"ID":"ef9f4db3-55ef-4b94-8044-d317f3b1b760","Type":"ContainerDied","Data":"2a3b432078db95eda3d799e7f8cd5e7c25d8466db60a93e248c126cdeec46bb7"} Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.656762 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a3b432078db95eda3d799e7f8cd5e7c25d8466db60a93e248c126cdeec46bb7" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.656817 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-e6c0-account-create-update-tr7lk" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.692439 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef9f4db3-55ef-4b94-8044-d317f3b1b760-operator-scripts\") pod \"ef9f4db3-55ef-4b94-8044-d317f3b1b760\" (UID: \"ef9f4db3-55ef-4b94-8044-d317f3b1b760\") " Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.692486 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmr7w\" (UniqueName: \"kubernetes.io/projected/ef9f4db3-55ef-4b94-8044-d317f3b1b760-kube-api-access-mmr7w\") pod \"ef9f4db3-55ef-4b94-8044-d317f3b1b760\" (UID: \"ef9f4db3-55ef-4b94-8044-d317f3b1b760\") " Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.692545 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90dd4a82-d074-47a2-a853-d7376f242c35-operator-scripts\") pod \"90dd4a82-d074-47a2-a853-d7376f242c35\" (UID: \"90dd4a82-d074-47a2-a853-d7376f242c35\") " Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.693122 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef9f4db3-55ef-4b94-8044-d317f3b1b760-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ef9f4db3-55ef-4b94-8044-d317f3b1b760" (UID: "ef9f4db3-55ef-4b94-8044-d317f3b1b760"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.694183 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90dd4a82-d074-47a2-a853-d7376f242c35-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90dd4a82-d074-47a2-a853-d7376f242c35" (UID: "90dd4a82-d074-47a2-a853-d7376f242c35"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.694882 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1785e28-d1f5-4985-b080-4496c9b43c6e-operator-scripts\") pod \"a1785e28-d1f5-4985-b080-4496c9b43c6e\" (UID: \"a1785e28-d1f5-4985-b080-4496c9b43c6e\") " Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.694968 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfpkp\" (UniqueName: \"kubernetes.io/projected/90dd4a82-d074-47a2-a853-d7376f242c35-kube-api-access-pfpkp\") pod \"90dd4a82-d074-47a2-a853-d7376f242c35\" (UID: \"90dd4a82-d074-47a2-a853-d7376f242c35\") " Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.694998 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7mr8\" (UniqueName: \"kubernetes.io/projected/a1785e28-d1f5-4985-b080-4496c9b43c6e-kube-api-access-m7mr8\") pod \"a1785e28-d1f5-4985-b080-4496c9b43c6e\" (UID: \"a1785e28-d1f5-4985-b080-4496c9b43c6e\") " Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.696621 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef9f4db3-55ef-4b94-8044-d317f3b1b760-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.696648 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90dd4a82-d074-47a2-a853-d7376f242c35-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.696661 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r8fh\" (UniqueName: \"kubernetes.io/projected/ac07b17f-06ac-42c1-8e1f-49f2101f4774-kube-api-access-7r8fh\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.696672 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac07b17f-06ac-42c1-8e1f-49f2101f4774-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.698249 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1785e28-d1f5-4985-b080-4496c9b43c6e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a1785e28-d1f5-4985-b080-4496c9b43c6e" (UID: "a1785e28-d1f5-4985-b080-4496c9b43c6e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.703919 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef9f4db3-55ef-4b94-8044-d317f3b1b760-kube-api-access-mmr7w" (OuterVolumeSpecName: "kube-api-access-mmr7w") pod "ef9f4db3-55ef-4b94-8044-d317f3b1b760" (UID: "ef9f4db3-55ef-4b94-8044-d317f3b1b760"). InnerVolumeSpecName "kube-api-access-mmr7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.717715 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1785e28-d1f5-4985-b080-4496c9b43c6e-kube-api-access-m7mr8" (OuterVolumeSpecName: "kube-api-access-m7mr8") pod "a1785e28-d1f5-4985-b080-4496c9b43c6e" (UID: "a1785e28-d1f5-4985-b080-4496c9b43c6e"). 
InnerVolumeSpecName "kube-api-access-m7mr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.720039 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90dd4a82-d074-47a2-a853-d7376f242c35-kube-api-access-pfpkp" (OuterVolumeSpecName: "kube-api-access-pfpkp") pod "90dd4a82-d074-47a2-a853-d7376f242c35" (UID: "90dd4a82-d074-47a2-a853-d7376f242c35"). InnerVolumeSpecName "kube-api-access-pfpkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:08:26 crc kubenswrapper[4740]: E0105 14:08:26.749083 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadbfdbc4_e4a5_46ed_9f3e_ccf21e081e25.slice/crio-e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac07b17f_06ac_42c1_8e1f_49f2101f4774.slice\": RecentStats: unable to find data in memory cache]" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.797632 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k49xv\" (UniqueName: \"kubernetes.io/projected/e383395f-47a3-4209-9cdb-102a606d9ce2-kube-api-access-k49xv\") pod \"e383395f-47a3-4209-9cdb-102a606d9ce2\" (UID: \"e383395f-47a3-4209-9cdb-102a606d9ce2\") " Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.797700 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e383395f-47a3-4209-9cdb-102a606d9ce2-operator-scripts\") pod \"e383395f-47a3-4209-9cdb-102a606d9ce2\" (UID: \"e383395f-47a3-4209-9cdb-102a606d9ce2\") " Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.798343 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7mr8\" (UniqueName: \"kubernetes.io/projected/a1785e28-d1f5-4985-b080-4496c9b43c6e-kube-api-access-m7mr8\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.798362 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfpkp\" (UniqueName: \"kubernetes.io/projected/90dd4a82-d074-47a2-a853-d7376f242c35-kube-api-access-pfpkp\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.798374 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmr7w\" (UniqueName: \"kubernetes.io/projected/ef9f4db3-55ef-4b94-8044-d317f3b1b760-kube-api-access-mmr7w\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.798384 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1785e28-d1f5-4985-b080-4496c9b43c6e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.799187 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e383395f-47a3-4209-9cdb-102a606d9ce2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e383395f-47a3-4209-9cdb-102a606d9ce2" (UID: "e383395f-47a3-4209-9cdb-102a606d9ce2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.802305 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e383395f-47a3-4209-9cdb-102a606d9ce2-kube-api-access-k49xv" (OuterVolumeSpecName: "kube-api-access-k49xv") pod "e383395f-47a3-4209-9cdb-102a606d9ce2" (UID: "e383395f-47a3-4209-9cdb-102a606d9ce2"). InnerVolumeSpecName "kube-api-access-k49xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.899813 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k49xv\" (UniqueName: \"kubernetes.io/projected/e383395f-47a3-4209-9cdb-102a606d9ce2-kube-api-access-k49xv\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:26 crc kubenswrapper[4740]: I0105 14:08:26.899847 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e383395f-47a3-4209-9cdb-102a606d9ce2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:27 crc kubenswrapper[4740]: I0105 14:08:27.702219 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kzngh" event={"ID":"6aa5ad76-5e26-4c24-8b3a-50ff2a182523","Type":"ContainerStarted","Data":"d4b65a82e39cd7c53f38459a66e71ae3e75f1a310110963ebacd44e27b5fd2a9"} Jan 05 14:08:27 crc kubenswrapper[4740]: I0105 14:08:27.752503 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-kzngh" podStartSLOduration=3.237718013 podStartE2EDuration="8.752483259s" podCreationTimestamp="2026-01-05 14:08:19 +0000 UTC" firstStartedPulling="2026-01-05 14:08:20.820593967 +0000 UTC m=+1150.127502546" lastFinishedPulling="2026-01-05 14:08:26.335359193 +0000 UTC m=+1155.642267792" observedRunningTime="2026-01-05 14:08:27.743857288 +0000 UTC m=+1157.050765867" watchObservedRunningTime="2026-01-05 14:08:27.752483259 +0000 UTC m=+1157.059391838" Jan 05 14:08:29 crc kubenswrapper[4740]: E0105 14:08:29.434994 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadbfdbc4_e4a5_46ed_9f3e_ccf21e081e25.slice/crio-e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e\": RecentStats: unable to find data in memory cache]" Jan 05 14:08:30 crc kubenswrapper[4740]: I0105 14:08:30.411307 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:30 crc kubenswrapper[4740]: I0105 14:08:30.503814 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nbmdk"] Jan 05 14:08:30 crc kubenswrapper[4740]: I0105 14:08:30.504089 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" podUID="6fe29f6d-955f-4f3f-b62f-634812236d3e" containerName="dnsmasq-dns" containerID="cri-o://c87abca0d72793970bea77bdca8ffb24f0f03ec87bb5bad620c09fc498aca5fd" gracePeriod=10 Jan 05 14:08:30 crc kubenswrapper[4740]: I0105 14:08:30.751167 4740 generic.go:334] "Generic (PLEG): container finished" podID="6aa5ad76-5e26-4c24-8b3a-50ff2a182523" containerID="d4b65a82e39cd7c53f38459a66e71ae3e75f1a310110963ebacd44e27b5fd2a9" exitCode=0 Jan 05 14:08:30 crc kubenswrapper[4740]: I0105 14:08:30.751325 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kzngh" 
event={"ID":"6aa5ad76-5e26-4c24-8b3a-50ff2a182523","Type":"ContainerDied","Data":"d4b65a82e39cd7c53f38459a66e71ae3e75f1a310110963ebacd44e27b5fd2a9"} Jan 05 14:08:30 crc kubenswrapper[4740]: I0105 14:08:30.754387 4740 generic.go:334] "Generic (PLEG): container finished" podID="6fe29f6d-955f-4f3f-b62f-634812236d3e" containerID="c87abca0d72793970bea77bdca8ffb24f0f03ec87bb5bad620c09fc498aca5fd" exitCode=0 Jan 05 14:08:30 crc kubenswrapper[4740]: I0105 14:08:30.754419 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" event={"ID":"6fe29f6d-955f-4f3f-b62f-634812236d3e","Type":"ContainerDied","Data":"c87abca0d72793970bea77bdca8ffb24f0f03ec87bb5bad620c09fc498aca5fd"} Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.001465 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.109902 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-ovsdbserver-nb\") pod \"6fe29f6d-955f-4f3f-b62f-634812236d3e\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.110353 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnwtq\" (UniqueName: \"kubernetes.io/projected/6fe29f6d-955f-4f3f-b62f-634812236d3e-kube-api-access-tnwtq\") pod \"6fe29f6d-955f-4f3f-b62f-634812236d3e\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.110447 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-ovsdbserver-sb\") pod \"6fe29f6d-955f-4f3f-b62f-634812236d3e\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.110472 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-config\") pod \"6fe29f6d-955f-4f3f-b62f-634812236d3e\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.110577 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-dns-svc\") pod \"6fe29f6d-955f-4f3f-b62f-634812236d3e\" (UID: \"6fe29f6d-955f-4f3f-b62f-634812236d3e\") " Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.124752 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe29f6d-955f-4f3f-b62f-634812236d3e-kube-api-access-tnwtq" (OuterVolumeSpecName: "kube-api-access-tnwtq") pod "6fe29f6d-955f-4f3f-b62f-634812236d3e" (UID: "6fe29f6d-955f-4f3f-b62f-634812236d3e"). InnerVolumeSpecName "kube-api-access-tnwtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.169972 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6fe29f6d-955f-4f3f-b62f-634812236d3e" (UID: "6fe29f6d-955f-4f3f-b62f-634812236d3e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.170955 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6fe29f6d-955f-4f3f-b62f-634812236d3e" (UID: "6fe29f6d-955f-4f3f-b62f-634812236d3e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.179395 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6fe29f6d-955f-4f3f-b62f-634812236d3e" (UID: "6fe29f6d-955f-4f3f-b62f-634812236d3e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.179670 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-config" (OuterVolumeSpecName: "config") pod "6fe29f6d-955f-4f3f-b62f-634812236d3e" (UID: "6fe29f6d-955f-4f3f-b62f-634812236d3e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.213717 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.213764 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnwtq\" (UniqueName: \"kubernetes.io/projected/6fe29f6d-955f-4f3f-b62f-634812236d3e-kube-api-access-tnwtq\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.213778 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.213803 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.213813 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fe29f6d-955f-4f3f-b62f-634812236d3e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.768389 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" event={"ID":"6fe29f6d-955f-4f3f-b62f-634812236d3e","Type":"ContainerDied","Data":"3b6ac88baff9332edc930d2589593a0906ddb1e0ff08811a2172e72d15216f6c"} Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.768475 4740 scope.go:117] "RemoveContainer" containerID="c87abca0d72793970bea77bdca8ffb24f0f03ec87bb5bad620c09fc498aca5fd" Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.768408 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-nbmdk" Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.795565 4740 scope.go:117] "RemoveContainer" containerID="81339fa256d495423f7e573ef3c248ecb52a7346c675a5aab6bc5ce2f5eabc65" Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.848916 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nbmdk"] Jan 05 14:08:31 crc kubenswrapper[4740]: I0105 14:08:31.860461 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nbmdk"] Jan 05 14:08:32 crc kubenswrapper[4740]: I0105 14:08:32.235226 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kzngh" Jan 05 14:08:32 crc kubenswrapper[4740]: I0105 14:08:32.365173 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-combined-ca-bundle\") pod \"6aa5ad76-5e26-4c24-8b3a-50ff2a182523\" (UID: \"6aa5ad76-5e26-4c24-8b3a-50ff2a182523\") " Jan 05 14:08:32 crc kubenswrapper[4740]: I0105 14:08:32.365325 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-config-data\") pod \"6aa5ad76-5e26-4c24-8b3a-50ff2a182523\" (UID: \"6aa5ad76-5e26-4c24-8b3a-50ff2a182523\") " Jan 05 14:08:32 crc kubenswrapper[4740]: I0105 14:08:32.365487 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6svvs\" (UniqueName: \"kubernetes.io/projected/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-kube-api-access-6svvs\") pod \"6aa5ad76-5e26-4c24-8b3a-50ff2a182523\" (UID: \"6aa5ad76-5e26-4c24-8b3a-50ff2a182523\") " Jan 05 14:08:32 crc kubenswrapper[4740]: I0105 14:08:32.371498 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-kube-api-access-6svvs" (OuterVolumeSpecName: "kube-api-access-6svvs") pod "6aa5ad76-5e26-4c24-8b3a-50ff2a182523" (UID: "6aa5ad76-5e26-4c24-8b3a-50ff2a182523"). InnerVolumeSpecName "kube-api-access-6svvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:08:32 crc kubenswrapper[4740]: I0105 14:08:32.420531 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6aa5ad76-5e26-4c24-8b3a-50ff2a182523" (UID: "6aa5ad76-5e26-4c24-8b3a-50ff2a182523"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:08:32 crc kubenswrapper[4740]: I0105 14:08:32.439332 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-config-data" (OuterVolumeSpecName: "config-data") pod "6aa5ad76-5e26-4c24-8b3a-50ff2a182523" (UID: "6aa5ad76-5e26-4c24-8b3a-50ff2a182523"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:08:32 crc kubenswrapper[4740]: I0105 14:08:32.468679 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:32 crc kubenswrapper[4740]: I0105 14:08:32.468731 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:32 crc kubenswrapper[4740]: I0105 14:08:32.468755 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6svvs\" (UniqueName: \"kubernetes.io/projected/6aa5ad76-5e26-4c24-8b3a-50ff2a182523-kube-api-access-6svvs\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:32 crc kubenswrapper[4740]: I0105 14:08:32.782678 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kzngh" event={"ID":"6aa5ad76-5e26-4c24-8b3a-50ff2a182523","Type":"ContainerDied","Data":"d28624c9885e4ff39590eb796572699250a5392af767796d977f339815ab98d5"} Jan 05 14:08:32 crc kubenswrapper[4740]: I0105 14:08:32.783014 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d28624c9885e4ff39590eb796572699250a5392af767796d977f339815ab98d5" Jan 05 14:08:32 crc kubenswrapper[4740]: I0105 14:08:32.782795 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kzngh" Jan 05 14:08:32 crc kubenswrapper[4740]: I0105 14:08:32.984774 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe29f6d-955f-4f3f-b62f-634812236d3e" path="/var/lib/kubelet/pods/6fe29f6d-955f-4f3f-b62f-634812236d3e/volumes" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.050521 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-fwgdv"] Jan 05 14:08:33 crc kubenswrapper[4740]: E0105 14:08:33.050938 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37129b31-a43d-412f-878e-72729d852124" containerName="mariadb-database-create" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.050959 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="37129b31-a43d-412f-878e-72729d852124" containerName="mariadb-database-create" Jan 05 14:08:33 crc kubenswrapper[4740]: E0105 14:08:33.050973 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe29f6d-955f-4f3f-b62f-634812236d3e" containerName="init" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.050979 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe29f6d-955f-4f3f-b62f-634812236d3e" containerName="init" Jan 05 14:08:33 crc kubenswrapper[4740]: E0105 14:08:33.050997 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3bfc133-670f-4697-b3f3-f7ae2fdd52ae" containerName="init" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051003 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3bfc133-670f-4697-b3f3-f7ae2fdd52ae" containerName="init" Jan 05 14:08:33 crc kubenswrapper[4740]: E0105 14:08:33.051018 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90dd4a82-d074-47a2-a853-d7376f242c35" containerName="mariadb-database-create" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051024 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="90dd4a82-d074-47a2-a853-d7376f242c35" 
containerName="mariadb-database-create" Jan 05 14:08:33 crc kubenswrapper[4740]: E0105 14:08:33.051036 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac07b17f-06ac-42c1-8e1f-49f2101f4774" containerName="mariadb-database-create" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051042 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac07b17f-06ac-42c1-8e1f-49f2101f4774" containerName="mariadb-database-create" Jan 05 14:08:33 crc kubenswrapper[4740]: E0105 14:08:33.051049 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3bfc133-670f-4697-b3f3-f7ae2fdd52ae" containerName="dnsmasq-dns" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051057 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3bfc133-670f-4697-b3f3-f7ae2fdd52ae" containerName="dnsmasq-dns" Jan 05 14:08:33 crc kubenswrapper[4740]: E0105 14:08:33.051075 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2def1143-79d4-48ec-bae4-f0b620c8bc73" containerName="mariadb-database-create" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051095 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="2def1143-79d4-48ec-bae4-f0b620c8bc73" containerName="mariadb-database-create" Jan 05 14:08:33 crc kubenswrapper[4740]: E0105 14:08:33.051102 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb21d30-fa05-490b-b1b1-cb0f73b8e149" containerName="mariadb-account-create-update" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051107 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb21d30-fa05-490b-b1b1-cb0f73b8e149" containerName="mariadb-account-create-update" Jan 05 14:08:33 crc kubenswrapper[4740]: E0105 14:08:33.051116 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1785e28-d1f5-4985-b080-4496c9b43c6e" containerName="mariadb-account-create-update" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051122 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1785e28-d1f5-4985-b080-4496c9b43c6e" containerName="mariadb-account-create-update" Jan 05 14:08:33 crc kubenswrapper[4740]: E0105 14:08:33.051135 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe29f6d-955f-4f3f-b62f-634812236d3e" containerName="dnsmasq-dns" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051141 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe29f6d-955f-4f3f-b62f-634812236d3e" containerName="dnsmasq-dns" Jan 05 14:08:33 crc kubenswrapper[4740]: E0105 14:08:33.051149 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e383395f-47a3-4209-9cdb-102a606d9ce2" containerName="mariadb-account-create-update" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051154 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e383395f-47a3-4209-9cdb-102a606d9ce2" containerName="mariadb-account-create-update" Jan 05 14:08:33 crc kubenswrapper[4740]: E0105 14:08:33.051167 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef9f4db3-55ef-4b94-8044-d317f3b1b760" containerName="mariadb-account-create-update" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051174 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef9f4db3-55ef-4b94-8044-d317f3b1b760" containerName="mariadb-account-create-update" Jan 05 14:08:33 crc kubenswrapper[4740]: E0105 14:08:33.051182 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa5ad76-5e26-4c24-8b3a-50ff2a182523" containerName="keystone-db-sync" Jan 
05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051187 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa5ad76-5e26-4c24-8b3a-50ff2a182523" containerName="keystone-db-sync" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051364 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef9f4db3-55ef-4b94-8044-d317f3b1b760" containerName="mariadb-account-create-update" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051378 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe29f6d-955f-4f3f-b62f-634812236d3e" containerName="dnsmasq-dns" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051389 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3bfc133-670f-4697-b3f3-f7ae2fdd52ae" containerName="dnsmasq-dns" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051397 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="2def1143-79d4-48ec-bae4-f0b620c8bc73" containerName="mariadb-database-create" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051411 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="37129b31-a43d-412f-878e-72729d852124" containerName="mariadb-database-create" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051417 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb21d30-fa05-490b-b1b1-cb0f73b8e149" containerName="mariadb-account-create-update" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051426 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="90dd4a82-d074-47a2-a853-d7376f242c35" containerName="mariadb-database-create" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051438 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac07b17f-06ac-42c1-8e1f-49f2101f4774" containerName="mariadb-database-create" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051449 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aa5ad76-5e26-4c24-8b3a-50ff2a182523" containerName="keystone-db-sync" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051457 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1785e28-d1f5-4985-b080-4496c9b43c6e" containerName="mariadb-account-create-update" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.051471 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="e383395f-47a3-4209-9cdb-102a606d9ce2" containerName="mariadb-account-create-update" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.057609 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.070301 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-fwgdv"] Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.087214 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-h6jzd"] Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.088591 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.093167 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.093321 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.093449 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wktmq" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.093558 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.093688 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.108771 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-h6jzd"] Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.203179 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-979vn"] Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.204647 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-979vn" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.207230 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.207463 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-vhwxd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.212773 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-979vn"] Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.225725 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.226085 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.226113 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.226166 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-fernet-keys\") pod \"keystone-bootstrap-h6jzd\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.226202 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmjk8\" (UniqueName: \"kubernetes.io/projected/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-kube-api-access-vmjk8\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.226250 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-config\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.226271 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-scripts\") pod \"keystone-bootstrap-h6jzd\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.226291 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-credential-keys\") pod \"keystone-bootstrap-h6jzd\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.226318 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-combined-ca-bundle\") pod \"keystone-bootstrap-h6jzd\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.226366 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-config-data\") pod \"keystone-bootstrap-h6jzd\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.226401 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.226416 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mht85\" (UniqueName: \"kubernetes.io/projected/7c79ced0-0772-4ae5-93e6-3410088d5d65-kube-api-access-mht85\") pod \"keystone-bootstrap-h6jzd\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.328386 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-combined-ca-bundle\") pod \"keystone-bootstrap-h6jzd\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: 
I0105 14:08:33.328457 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-config-data\") pod \"keystone-bootstrap-h6jzd\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.328492 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.328514 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mht85\" (UniqueName: \"kubernetes.io/projected/7c79ced0-0772-4ae5-93e6-3410088d5d65-kube-api-access-mht85\") pod \"keystone-bootstrap-h6jzd\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.328565 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-config-data\") pod \"heat-db-sync-979vn\" (UID: \"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d\") " pod="openstack/heat-db-sync-979vn" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.328635 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.328658 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.328679 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.328707 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-combined-ca-bundle\") pod \"heat-db-sync-979vn\" (UID: \"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d\") " pod="openstack/heat-db-sync-979vn" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.328729 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tm2w\" (UniqueName: \"kubernetes.io/projected/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-kube-api-access-2tm2w\") pod \"heat-db-sync-979vn\" (UID: \"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d\") " pod="openstack/heat-db-sync-979vn" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.328751 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-fernet-keys\") pod \"keystone-bootstrap-h6jzd\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.328781 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmjk8\" (UniqueName: \"kubernetes.io/projected/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-kube-api-access-vmjk8\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.328823 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-config\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.328843 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-scripts\") pod \"keystone-bootstrap-h6jzd\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.328864 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-credential-keys\") pod \"keystone-bootstrap-h6jzd\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.329372 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.329939 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.332550 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.334579 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-config\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.335025 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-scripts\") pod \"keystone-bootstrap-h6jzd\" (UID: 
\"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.335684 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-credential-keys\") pod \"keystone-bootstrap-h6jzd\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.336710 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-combined-ca-bundle\") pod \"keystone-bootstrap-h6jzd\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.339449 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-fernet-keys\") pod \"keystone-bootstrap-h6jzd\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.339974 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-config-data\") pod \"keystone-bootstrap-h6jzd\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.341294 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.361413 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-rkvws"] Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.362687 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rkvws" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.368404 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmjk8\" (UniqueName: \"kubernetes.io/projected/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-kube-api-access-vmjk8\") pod \"dnsmasq-dns-bbf5cc879-fwgdv\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.385547 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.386117 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.390055 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rkvws"] Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.419077 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qgz2z" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.420179 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.429466 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mht85\" (UniqueName: \"kubernetes.io/projected/7c79ced0-0772-4ae5-93e6-3410088d5d65-kube-api-access-mht85\") pod \"keystone-bootstrap-h6jzd\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.436548 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-combined-ca-bundle\") pod \"neutron-db-sync-rkvws\" (UID: \"60a4f2a8-8fa7-44b6-a6bc-15531c720f24\") " pod="openstack/neutron-db-sync-rkvws" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.436687 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-config-data\") pod \"heat-db-sync-979vn\" (UID: \"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d\") " pod="openstack/heat-db-sync-979vn" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.436723 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffhq7\" (UniqueName: \"kubernetes.io/projected/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-kube-api-access-ffhq7\") pod \"neutron-db-sync-rkvws\" (UID: \"60a4f2a8-8fa7-44b6-a6bc-15531c720f24\") " pod="openstack/neutron-db-sync-rkvws" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.436746 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-config\") pod \"neutron-db-sync-rkvws\" (UID: \"60a4f2a8-8fa7-44b6-a6bc-15531c720f24\") " pod="openstack/neutron-db-sync-rkvws" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.436802 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-combined-ca-bundle\") pod \"heat-db-sync-979vn\" (UID: \"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d\") " pod="openstack/heat-db-sync-979vn" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.436824 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tm2w\" (UniqueName: \"kubernetes.io/projected/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-kube-api-access-2tm2w\") pod \"heat-db-sync-979vn\" (UID: \"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d\") " pod="openstack/heat-db-sync-979vn" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.440131 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-config-data\") pod \"heat-db-sync-979vn\" (UID: \"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d\") " pod="openstack/heat-db-sync-979vn" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.448981 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-combined-ca-bundle\") pod \"heat-db-sync-979vn\" (UID: \"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d\") " pod="openstack/heat-db-sync-979vn" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.487482 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tm2w\" (UniqueName: \"kubernetes.io/projected/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-kube-api-access-2tm2w\") pod \"heat-db-sync-979vn\" (UID: \"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d\") " pod="openstack/heat-db-sync-979vn" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.491557 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-lzmfj"] Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.492833 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.494333 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.494646 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-gjqkx" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.499440 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.543409 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffhq7\" (UniqueName: \"kubernetes.io/projected/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-kube-api-access-ffhq7\") pod \"neutron-db-sync-rkvws\" (UID: \"60a4f2a8-8fa7-44b6-a6bc-15531c720f24\") " pod="openstack/neutron-db-sync-rkvws" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.543459 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-config\") pod \"neutron-db-sync-rkvws\" (UID: \"60a4f2a8-8fa7-44b6-a6bc-15531c720f24\") " pod="openstack/neutron-db-sync-rkvws" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.543542 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-combined-ca-bundle\") pod \"neutron-db-sync-rkvws\" (UID: \"60a4f2a8-8fa7-44b6-a6bc-15531c720f24\") " pod="openstack/neutron-db-sync-rkvws" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.544200 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-979vn" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.551763 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-b2trl"] Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.553132 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-b2trl" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.558392 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-mgm5g" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.558409 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-combined-ca-bundle\") pod \"neutron-db-sync-rkvws\" (UID: \"60a4f2a8-8fa7-44b6-a6bc-15531c720f24\") " pod="openstack/neutron-db-sync-rkvws" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.558586 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.558844 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-config\") pod \"neutron-db-sync-rkvws\" (UID: \"60a4f2a8-8fa7-44b6-a6bc-15531c720f24\") " pod="openstack/neutron-db-sync-rkvws" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.572220 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffhq7\" (UniqueName: \"kubernetes.io/projected/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-kube-api-access-ffhq7\") pod \"neutron-db-sync-rkvws\" (UID: \"60a4f2a8-8fa7-44b6-a6bc-15531c720f24\") " pod="openstack/neutron-db-sync-rkvws" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.582606 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lzmfj"] Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.606146 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-b2trl"] Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.611905 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rkvws" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.650515 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-combined-ca-bundle\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.650572 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-db-sync-config-data\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.650592 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-scripts\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.650648 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-config-data\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.650871 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54rz4\" (UniqueName: \"kubernetes.io/projected/333dfe82-8fdd-400e-8b8c-89906d81e778-kube-api-access-54rz4\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.650896 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401ab705-cb09-4760-840d-2b99b6c9148c-combined-ca-bundle\") pod \"barbican-db-sync-b2trl\" (UID: \"401ab705-cb09-4760-840d-2b99b6c9148c\") " pod="openstack/barbican-db-sync-b2trl" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.650964 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/333dfe82-8fdd-400e-8b8c-89906d81e778-etc-machine-id\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.651000 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/401ab705-cb09-4760-840d-2b99b6c9148c-db-sync-config-data\") pod \"barbican-db-sync-b2trl\" (UID: \"401ab705-cb09-4760-840d-2b99b6c9148c\") " pod="openstack/barbican-db-sync-b2trl" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.651028 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlpvt\" (UniqueName: \"kubernetes.io/projected/401ab705-cb09-4760-840d-2b99b6c9148c-kube-api-access-qlpvt\") pod \"barbican-db-sync-b2trl\" (UID: 
\"401ab705-cb09-4760-840d-2b99b6c9148c\") " pod="openstack/barbican-db-sync-b2trl" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.666483 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-kghbd"] Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.667825 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-kghbd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.685550 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.685745 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8bmxt" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.686239 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.718893 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.727308 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-kghbd"] Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.765153 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlpvt\" (UniqueName: \"kubernetes.io/projected/401ab705-cb09-4760-840d-2b99b6c9148c-kube-api-access-qlpvt\") pod \"barbican-db-sync-b2trl\" (UID: \"401ab705-cb09-4760-840d-2b99b6c9148c\") " pod="openstack/barbican-db-sync-b2trl" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.768547 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-fwgdv"] Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.768726 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-combined-ca-bundle\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.768814 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-scripts\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.768856 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-db-sync-config-data\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.768958 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-config-data\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.769022 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54rz4\" (UniqueName: 
\"kubernetes.io/projected/333dfe82-8fdd-400e-8b8c-89906d81e778-kube-api-access-54rz4\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.769103 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401ab705-cb09-4760-840d-2b99b6c9148c-combined-ca-bundle\") pod \"barbican-db-sync-b2trl\" (UID: \"401ab705-cb09-4760-840d-2b99b6c9148c\") " pod="openstack/barbican-db-sync-b2trl" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.769329 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/333dfe82-8fdd-400e-8b8c-89906d81e778-etc-machine-id\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.769419 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/401ab705-cb09-4760-840d-2b99b6c9148c-db-sync-config-data\") pod \"barbican-db-sync-b2trl\" (UID: \"401ab705-cb09-4760-840d-2b99b6c9148c\") " pod="openstack/barbican-db-sync-b2trl" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.781970 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/333dfe82-8fdd-400e-8b8c-89906d81e778-etc-machine-id\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.786411 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-combined-ca-bundle\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.802668 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-config-data\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.805800 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlpvt\" (UniqueName: \"kubernetes.io/projected/401ab705-cb09-4760-840d-2b99b6c9148c-kube-api-access-qlpvt\") pod \"barbican-db-sync-b2trl\" (UID: \"401ab705-cb09-4760-840d-2b99b6c9148c\") " pod="openstack/barbican-db-sync-b2trl" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.807278 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/401ab705-cb09-4760-840d-2b99b6c9148c-db-sync-config-data\") pod \"barbican-db-sync-b2trl\" (UID: \"401ab705-cb09-4760-840d-2b99b6c9148c\") " pod="openstack/barbican-db-sync-b2trl" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.812132 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-scripts\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 
14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.815695 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401ab705-cb09-4760-840d-2b99b6c9148c-combined-ca-bundle\") pod \"barbican-db-sync-b2trl\" (UID: \"401ab705-cb09-4760-840d-2b99b6c9148c\") " pod="openstack/barbican-db-sync-b2trl" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.816707 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54rz4\" (UniqueName: \"kubernetes.io/projected/333dfe82-8fdd-400e-8b8c-89906d81e778-kube-api-access-54rz4\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.827853 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-db-sync-config-data\") pod \"cinder-db-sync-lzmfj\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.862264 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-glxvc"] Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.895019 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.896583 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-config-data\") pod \"placement-db-sync-kghbd\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " pod="openstack/placement-db-sync-kghbd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.896623 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv5kv\" (UniqueName: \"kubernetes.io/projected/9b7a41a0-75e3-4638-9856-90adffd28751-kube-api-access-vv5kv\") pod \"placement-db-sync-kghbd\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " pod="openstack/placement-db-sync-kghbd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.896702 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7a41a0-75e3-4638-9856-90adffd28751-logs\") pod \"placement-db-sync-kghbd\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " pod="openstack/placement-db-sync-kghbd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.896798 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-scripts\") pod \"placement-db-sync-kghbd\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " pod="openstack/placement-db-sync-kghbd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.896823 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-combined-ca-bundle\") pod \"placement-db-sync-kghbd\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " pod="openstack/placement-db-sync-kghbd" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.899974 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-56df8fb6b7-glxvc"] Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.905802 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-b2trl" Jan 05 14:08:33 crc kubenswrapper[4740]: I0105 14:08:33.997335 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.001575 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.005958 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.006204 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.006402 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-config\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.006452 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hgc2\" (UniqueName: \"kubernetes.io/projected/425802c2-2528-451d-960a-24372126b18c-kube-api-access-6hgc2\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.006504 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.008773 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.008868 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-scripts\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.008895 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.008921 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425802c2-2528-451d-960a-24372126b18c-log-httpd\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" 
Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.008957 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-scripts\") pod \"placement-db-sync-kghbd\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " pod="openstack/placement-db-sync-kghbd" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.008981 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.009029 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-config-data\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.009049 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-combined-ca-bundle\") pod \"placement-db-sync-kghbd\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " pod="openstack/placement-db-sync-kghbd" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.034317 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.034406 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh69w\" (UniqueName: \"kubernetes.io/projected/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-kube-api-access-nh69w\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.034477 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-config-data\") pod \"placement-db-sync-kghbd\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " pod="openstack/placement-db-sync-kghbd" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.034519 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv5kv\" (UniqueName: \"kubernetes.io/projected/9b7a41a0-75e3-4638-9856-90adffd28751-kube-api-access-vv5kv\") pod \"placement-db-sync-kghbd\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " pod="openstack/placement-db-sync-kghbd" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.034599 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425802c2-2528-451d-960a-24372126b18c-run-httpd\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.034638 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.034709 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7a41a0-75e3-4638-9856-90adffd28751-logs\") pod \"placement-db-sync-kghbd\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " pod="openstack/placement-db-sync-kghbd" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.036373 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7a41a0-75e3-4638-9856-90adffd28751-logs\") pod \"placement-db-sync-kghbd\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " pod="openstack/placement-db-sync-kghbd" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.046286 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-combined-ca-bundle\") pod \"placement-db-sync-kghbd\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " pod="openstack/placement-db-sync-kghbd" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.046753 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-scripts\") pod \"placement-db-sync-kghbd\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " pod="openstack/placement-db-sync-kghbd" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.056264 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-config-data\") pod \"placement-db-sync-kghbd\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " pod="openstack/placement-db-sync-kghbd" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.066588 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.078665 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv5kv\" (UniqueName: \"kubernetes.io/projected/9b7a41a0-75e3-4638-9856-90adffd28751-kube-api-access-vv5kv\") pod \"placement-db-sync-kghbd\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " pod="openstack/placement-db-sync-kghbd" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.082285 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.138647 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-config\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.138688 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hgc2\" (UniqueName: \"kubernetes.io/projected/425802c2-2528-451d-960a-24372126b18c-kube-api-access-6hgc2\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.141865 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.141918 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.141982 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-scripts\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.142008 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.142029 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425802c2-2528-451d-960a-24372126b18c-log-httpd\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.142094 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 
crc kubenswrapper[4740]: I0105 14:08:34.142568 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425802c2-2528-451d-960a-24372126b18c-log-httpd\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.142858 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.143712 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.144091 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-config-data\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.144168 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.144205 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh69w\" (UniqueName: \"kubernetes.io/projected/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-kube-api-access-nh69w\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.144327 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425802c2-2528-451d-960a-24372126b18c-run-httpd\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.144360 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.146363 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.148006 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/425802c2-2528-451d-960a-24372126b18c-run-httpd\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.149941 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-config\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.155552 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.158722 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.166316 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hgc2\" (UniqueName: \"kubernetes.io/projected/425802c2-2528-451d-960a-24372126b18c-kube-api-access-6hgc2\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.174267 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.174315 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-scripts\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.182262 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh69w\" (UniqueName: \"kubernetes.io/projected/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-kube-api-access-nh69w\") pod \"dnsmasq-dns-56df8fb6b7-glxvc\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.184902 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-config-data\") pod \"ceilometer-0\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.249129 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.251242 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.258756 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.263498 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.263548 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.263510 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.263739 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kw6zt" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.269530 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-kghbd" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.288062 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.359296 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.359336 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/319db572-e2af-4b65-95b2-4ebd62d38c4a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.359355 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/319db572-e2af-4b65-95b2-4ebd62d38c4a-logs\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.359395 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxvqf\" (UniqueName: \"kubernetes.io/projected/319db572-e2af-4b65-95b2-4ebd62d38c4a-kube-api-access-zxvqf\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.359431 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-scripts\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.359455 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.359472 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.359537 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-config-data\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.367209 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.372615 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.390413 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.408581 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.408783 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.414973 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.462588 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-config-data\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.462698 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.462727 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/319db572-e2af-4b65-95b2-4ebd62d38c4a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.462745 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/319db572-e2af-4b65-95b2-4ebd62d38c4a-logs\") pod \"glance-default-external-api-0\" (UID: 
\"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.462789 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxvqf\" (UniqueName: \"kubernetes.io/projected/319db572-e2af-4b65-95b2-4ebd62d38c4a-kube-api-access-zxvqf\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.462833 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-scripts\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.462863 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.462885 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.463335 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/319db572-e2af-4b65-95b2-4ebd62d38c4a-logs\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.463582 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/319db572-e2af-4b65-95b2-4ebd62d38c4a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.469906 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-config-data\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.471396 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.472432 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " 
pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.473929 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-scripts\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.491165 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxvqf\" (UniqueName: \"kubernetes.io/projected/319db572-e2af-4b65-95b2-4ebd62d38c4a-kube-api-access-zxvqf\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.499239 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.499281 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9f181c6c175c062630ff3c011c7f7b401c107aabb3ab99d5fa4b9467f0f8d5d6/globalmount\"" pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.527689 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-fwgdv"] Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.565184 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgctl\" (UniqueName: \"kubernetes.io/projected/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-kube-api-access-dgctl\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.565286 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.565309 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.565355 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.565400 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-logs\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.565455 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.565491 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.565525 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.624365 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") pod \"glance-default-external-api-0\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.686342 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.686424 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-logs\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.686561 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.686635 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.686713 
4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.686815 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgctl\" (UniqueName: \"kubernetes.io/projected/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-kube-api-access-dgctl\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.686991 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.687019 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.687697 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-logs\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.688560 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.689927 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
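The csi_attacher.go:380 message above ("STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...") indicates that the kubevirt.io.hostpath-provisioner node plugin does not advertise volume staging, so the kubelet records MountVolume.MountDevice as succeeded at the .../globalmount path without issuing NodeStageVolume and proceeds straight to SetUp (NodePublishVolume) for the glance PVCs. For reference, a minimal sketch of how a CSI node service advertises that capability, assuming the standard container-storage-interface Go bindings — the nodeServer type is a hypothetical stand-in, not the hostpath provisioner's actual code:

```go
// Sketch only: a node service that advertises STAGE_UNSTAGE_VOLUME. Omitting
// this capability is what produces the "Skipping MountDevice..." lines above.
package main

import (
	"context"
	"fmt"

	"github.com/container-storage-interface/spec/lib/go/csi"
)

// nodeServer is a hypothetical stand-in; a real driver implements the full
// csi.NodeServer interface (NodeStageVolume, NodePublishVolume, ...).
type nodeServer struct{}

func (ns *nodeServer) NodeGetCapabilities(ctx context.Context, req *csi.NodeGetCapabilitiesRequest) (*csi.NodeGetCapabilitiesResponse, error) {
	return &csi.NodeGetCapabilitiesResponse{
		Capabilities: []*csi.NodeServiceCapability{{
			Type: &csi.NodeServiceCapability_Rpc{
				Rpc: &csi.NodeServiceCapability_RPC{
					// With this set, the kubelet calls NodeStageVolume against
					// the per-volume .../globalmount path before publishing
					// into each pod; without it, MountDevice is skipped.
					Type: csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME,
				},
			},
		}},
	}, nil
}

func main() {
	resp, err := (&nodeServer{}).NodeGetCapabilities(context.Background(), &csi.NodeGetCapabilitiesRequest{})
	if err != nil {
		panic(err)
	}
	fmt.Println(resp.GetCapabilities())
}
```

When staging is advertised, the volume is prepared once at the globalmount path and then published into each consuming pod's volume directory; without it, NodePublishVolume alone is expected to make the volume available, which matches the single MountVolume.SetUp line each glance PVC gets below.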
Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.689969 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f94b1d52da0ffd2f3c454948afdab6168cc98b014f61ee88f5256307d60f00b0/globalmount\"" pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.691822 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.694672 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.694859 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.696161 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.723779 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgctl\" (UniqueName: \"kubernetes.io/projected/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-kube-api-access-dgctl\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.798768 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") pod \"glance-default-internal-api-0\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: W0105 14:08:34.894272 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded5f3fd5_cc0b_42c1_8cee_b28452adde1d.slice/crio-3767ec1b5e43de8baff2bb383872df7fab367ac0f5f9a038f7151c61b39d0242 WatchSource:0}: Error finding container 3767ec1b5e43de8baff2bb383872df7fab367ac0f5f9a038f7151c61b39d0242: Status 404 returned error can't find the container with id 3767ec1b5e43de8baff2bb383872df7fab367ac0f5f9a038f7151c61b39d0242 Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.899667 4740 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-979vn"] Jan 05 14:08:34 crc kubenswrapper[4740]: W0105 14:08:34.902815 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60a4f2a8_8fa7_44b6_a6bc_15531c720f24.slice/crio-2c492cbab45df40b9522ccf66c76de22e064c2f5d37393c874a61bda024969ee WatchSource:0}: Error finding container 2c492cbab45df40b9522ccf66c76de22e064c2f5d37393c874a61bda024969ee: Status 404 returned error can't find the container with id 2c492cbab45df40b9522ccf66c76de22e064c2f5d37393c874a61bda024969ee Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.904650 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.921275 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rkvws"] Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.930693 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-h6jzd"] Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.953853 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" event={"ID":"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0","Type":"ContainerStarted","Data":"fa29ed9ae52e58c545a63a8ad1a5e5e127e85eae4609cd0ced00f23e8057145d"} Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.958223 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-979vn" event={"ID":"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d","Type":"ContainerStarted","Data":"3767ec1b5e43de8baff2bb383872df7fab367ac0f5f9a038f7151c61b39d0242"} Jan 05 14:08:34 crc kubenswrapper[4740]: I0105 14:08:34.961638 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rkvws" event={"ID":"60a4f2a8-8fa7-44b6-a6bc-15531c720f24","Type":"ContainerStarted","Data":"2c492cbab45df40b9522ccf66c76de22e064c2f5d37393c874a61bda024969ee"} Jan 05 14:08:35 crc kubenswrapper[4740]: I0105 14:08:35.018704 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 05 14:08:35 crc kubenswrapper[4740]: I0105 14:08:35.409054 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-kghbd"] Jan 05 14:08:35 crc kubenswrapper[4740]: I0105 14:08:35.456132 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-glxvc"] Jan 05 14:08:35 crc kubenswrapper[4740]: I0105 14:08:35.540630 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 05 14:08:35 crc kubenswrapper[4740]: I0105 14:08:35.555914 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lzmfj"] Jan 05 14:08:35 crc kubenswrapper[4740]: I0105 14:08:35.592987 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:08:35 crc kubenswrapper[4740]: I0105 14:08:35.623129 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-b2trl"] Jan 05 14:08:35 crc kubenswrapper[4740]: I0105 14:08:35.652188 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 05 14:08:35 crc kubenswrapper[4740]: I0105 14:08:35.902396 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 05 14:08:35 crc kubenswrapper[4740]: I0105 14:08:35.991747 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.018050 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-kghbd" event={"ID":"9b7a41a0-75e3-4638-9856-90adffd28751","Type":"ContainerStarted","Data":"5d23bb36f7939423b498cd785d8f02617b255e40d832f0ecc239c44210a4ba77"} Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.022206 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lzmfj" event={"ID":"333dfe82-8fdd-400e-8b8c-89906d81e778","Type":"ContainerStarted","Data":"7448cdc041c0f54f9733f2ab65e6f420cae4e64f9d5114acc836aad83277d3ec"} Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.027741 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"319db572-e2af-4b65-95b2-4ebd62d38c4a","Type":"ContainerStarted","Data":"7c40dbd6d5e57917d023abcc046cee4129153c4bd62b0b8d3ecb91087a96efda"} Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.030294 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" event={"ID":"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04","Type":"ContainerStarted","Data":"f5e973165c2340b975171abe1645edbbdefed2ba25d6fb4fed6763df346a7a62"} Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.032141 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b2trl" event={"ID":"401ab705-cb09-4760-840d-2b99b6c9148c","Type":"ContainerStarted","Data":"a45320334ab3d9d48b06605805aaad6f7dfa1145f29060a0b7bf6720736382fa"} Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.034220 4740 generic.go:334] "Generic (PLEG): container finished" podID="a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0" containerID="2069b30cf7ac8e40ca77f25397812d5c9f9ba9b40fa767e49b1089a55cea7d3a" exitCode=0 Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.034372 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" 
event={"ID":"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0","Type":"ContainerDied","Data":"2069b30cf7ac8e40ca77f25397812d5c9f9ba9b40fa767e49b1089a55cea7d3a"} Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.038227 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425802c2-2528-451d-960a-24372126b18c","Type":"ContainerStarted","Data":"529ef03720f0381c621357a611ff36271476c32c5676f8a354e7e2e39feefad7"} Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.046655 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rkvws" event={"ID":"60a4f2a8-8fa7-44b6-a6bc-15531c720f24","Type":"ContainerStarted","Data":"acbed7293a2f8b258573fd52eb80726a34d8891f6ade563484839fa998b70910"} Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.072441 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h6jzd" event={"ID":"7c79ced0-0772-4ae5-93e6-3410088d5d65","Type":"ContainerStarted","Data":"a7dea7bce463deeb0849a33a3a74c8beb5e29f29e178831abb5786dd73f8fca7"} Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.072801 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h6jzd" event={"ID":"7c79ced0-0772-4ae5-93e6-3410088d5d65","Type":"ContainerStarted","Data":"1a79e0c1f882b1c660d00a626e44d5bda0d9052be8b64cd99641c1b67f2423fd"} Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.117768 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-rkvws" podStartSLOduration=3.117741643 podStartE2EDuration="3.117741643s" podCreationTimestamp="2026-01-05 14:08:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:08:36.079486028 +0000 UTC m=+1165.386394607" watchObservedRunningTime="2026-01-05 14:08:36.117741643 +0000 UTC m=+1165.424650222" Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.171837 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.186865 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-h6jzd" podStartSLOduration=3.186845255 podStartE2EDuration="3.186845255s" podCreationTimestamp="2026-01-05 14:08:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:08:36.108155347 +0000 UTC m=+1165.415063926" watchObservedRunningTime="2026-01-05 14:08:36.186845255 +0000 UTC m=+1165.493753834" Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.544592 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.593303 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-dns-svc\") pod \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.593352 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmjk8\" (UniqueName: \"kubernetes.io/projected/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-kube-api-access-vmjk8\") pod \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.593516 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-config\") pod \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.593724 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-dns-swift-storage-0\") pod \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.593896 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-ovsdbserver-nb\") pod \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.594017 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-ovsdbserver-sb\") pod \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\" (UID: \"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0\") " Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.604280 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-kube-api-access-vmjk8" (OuterVolumeSpecName: "kube-api-access-vmjk8") pod "a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0" (UID: "a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0"). InnerVolumeSpecName "kube-api-access-vmjk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.652626 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0" (UID: "a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.653191 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0" (UID: "a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.700147 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0" (UID: "a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.701595 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.701628 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmjk8\" (UniqueName: \"kubernetes.io/projected/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-kube-api-access-vmjk8\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.701639 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.714160 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0" (UID: "a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.724647 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-config" (OuterVolumeSpecName: "config") pod "a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0" (UID: "a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.807279 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.807488 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:36 crc kubenswrapper[4740]: I0105 14:08:36.807500 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:37 crc kubenswrapper[4740]: I0105 14:08:37.098718 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"319db572-e2af-4b65-95b2-4ebd62d38c4a","Type":"ContainerStarted","Data":"ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2"} Jan 05 14:08:37 crc kubenswrapper[4740]: I0105 14:08:37.102625 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bdc3dd07-42ff-46f5-a41b-b49d590b65b2","Type":"ContainerStarted","Data":"ce2339c290f7a431d794847142ff24b308b71f976e3dfc5749731712870de336"} Jan 05 14:08:37 crc kubenswrapper[4740]: I0105 14:08:37.105550 4740 generic.go:334] "Generic (PLEG): container finished" podID="8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04" containerID="598942101b1e719c5f3b57d3c30d747213576872bf303f654f185c6fc64fb387" exitCode=0 Jan 05 14:08:37 crc kubenswrapper[4740]: I0105 14:08:37.105601 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" event={"ID":"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04","Type":"ContainerDied","Data":"598942101b1e719c5f3b57d3c30d747213576872bf303f654f185c6fc64fb387"} Jan 05 14:08:37 crc kubenswrapper[4740]: I0105 14:08:37.117126 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" Jan 05 14:08:37 crc kubenswrapper[4740]: I0105 14:08:37.117588 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-fwgdv" event={"ID":"a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0","Type":"ContainerDied","Data":"fa29ed9ae52e58c545a63a8ad1a5e5e127e85eae4609cd0ced00f23e8057145d"} Jan 05 14:08:37 crc kubenswrapper[4740]: I0105 14:08:37.120044 4740 scope.go:117] "RemoveContainer" containerID="2069b30cf7ac8e40ca77f25397812d5c9f9ba9b40fa767e49b1089a55cea7d3a" Jan 05 14:08:37 crc kubenswrapper[4740]: I0105 14:08:37.184693 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-fwgdv"] Jan 05 14:08:37 crc kubenswrapper[4740]: I0105 14:08:37.193922 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-fwgdv"] Jan 05 14:08:37 crc kubenswrapper[4740]: E0105 14:08:37.275633 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadbfdbc4_e4a5_46ed_9f3e_ccf21e081e25.slice/crio-e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e\": RecentStats: unable to find data in memory cache]" Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.130990 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" event={"ID":"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04","Type":"ContainerStarted","Data":"dcb9b4f14b12e86a1db4aaf0888171fe96abe11fd1336ca0f511d258f29a07e5"} Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.131274 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.133610 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"319db572-e2af-4b65-95b2-4ebd62d38c4a","Type":"ContainerStarted","Data":"4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96"} Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.133719 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="319db572-e2af-4b65-95b2-4ebd62d38c4a" containerName="glance-log" containerID="cri-o://ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2" gracePeriod=30 Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.133728 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="319db572-e2af-4b65-95b2-4ebd62d38c4a" containerName="glance-httpd" containerID="cri-o://4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96" gracePeriod=30 Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.141332 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bdc3dd07-42ff-46f5-a41b-b49d590b65b2","Type":"ContainerStarted","Data":"883663c4dc8a569943b94ba10a5d5c4f465b672a3e2ef491a6c3f8f1c4df59e7"} Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.141360 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bdc3dd07-42ff-46f5-a41b-b49d590b65b2","Type":"ContainerStarted","Data":"988908b547087a37bfdcfad4847887ce39768efc0a873243d2419114e0b56181"} Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.141516 4740 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bdc3dd07-42ff-46f5-a41b-b49d590b65b2" containerName="glance-httpd" containerID="cri-o://883663c4dc8a569943b94ba10a5d5c4f465b672a3e2ef491a6c3f8f1c4df59e7" gracePeriod=30 Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.141524 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bdc3dd07-42ff-46f5-a41b-b49d590b65b2" containerName="glance-log" containerID="cri-o://988908b547087a37bfdcfad4847887ce39768efc0a873243d2419114e0b56181" gracePeriod=30 Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.183527 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.183511217 podStartE2EDuration="5.183511217s" podCreationTimestamp="2026-01-05 14:08:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:08:38.181202746 +0000 UTC m=+1167.488111315" watchObservedRunningTime="2026-01-05 14:08:38.183511217 +0000 UTC m=+1167.490419796" Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.189507 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" podStartSLOduration=5.189501137 podStartE2EDuration="5.189501137s" podCreationTimestamp="2026-01-05 14:08:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:08:38.159138114 +0000 UTC m=+1167.466046703" watchObservedRunningTime="2026-01-05 14:08:38.189501137 +0000 UTC m=+1167.496409716" Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.206040 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.20601717 podStartE2EDuration="5.20601717s" podCreationTimestamp="2026-01-05 14:08:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:08:38.197365088 +0000 UTC m=+1167.504273667" watchObservedRunningTime="2026-01-05 14:08:38.20601717 +0000 UTC m=+1167.512925749" Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.800520 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.979243 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-public-tls-certs\") pod \"319db572-e2af-4b65-95b2-4ebd62d38c4a\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.979344 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-scripts\") pod \"319db572-e2af-4b65-95b2-4ebd62d38c4a\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.979370 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxvqf\" (UniqueName: \"kubernetes.io/projected/319db572-e2af-4b65-95b2-4ebd62d38c4a-kube-api-access-zxvqf\") pod \"319db572-e2af-4b65-95b2-4ebd62d38c4a\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.979557 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/319db572-e2af-4b65-95b2-4ebd62d38c4a-httpd-run\") pod \"319db572-e2af-4b65-95b2-4ebd62d38c4a\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.979784 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") pod \"319db572-e2af-4b65-95b2-4ebd62d38c4a\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.979877 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-combined-ca-bundle\") pod \"319db572-e2af-4b65-95b2-4ebd62d38c4a\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.979920 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/319db572-e2af-4b65-95b2-4ebd62d38c4a-logs\") pod \"319db572-e2af-4b65-95b2-4ebd62d38c4a\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.979919 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/319db572-e2af-4b65-95b2-4ebd62d38c4a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "319db572-e2af-4b65-95b2-4ebd62d38c4a" (UID: "319db572-e2af-4b65-95b2-4ebd62d38c4a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.979948 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-config-data\") pod \"319db572-e2af-4b65-95b2-4ebd62d38c4a\" (UID: \"319db572-e2af-4b65-95b2-4ebd62d38c4a\") " Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.980129 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/319db572-e2af-4b65-95b2-4ebd62d38c4a-logs" (OuterVolumeSpecName: "logs") pod "319db572-e2af-4b65-95b2-4ebd62d38c4a" (UID: "319db572-e2af-4b65-95b2-4ebd62d38c4a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.981601 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/319db572-e2af-4b65-95b2-4ebd62d38c4a-logs\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.981631 4740 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/319db572-e2af-4b65-95b2-4ebd62d38c4a-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.983821 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0" path="/var/lib/kubelet/pods/a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0/volumes" Jan 05 14:08:38 crc kubenswrapper[4740]: I0105 14:08:38.987366 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-scripts" (OuterVolumeSpecName: "scripts") pod "319db572-e2af-4b65-95b2-4ebd62d38c4a" (UID: "319db572-e2af-4b65-95b2-4ebd62d38c4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.001464 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/319db572-e2af-4b65-95b2-4ebd62d38c4a-kube-api-access-zxvqf" (OuterVolumeSpecName: "kube-api-access-zxvqf") pod "319db572-e2af-4b65-95b2-4ebd62d38c4a" (UID: "319db572-e2af-4b65-95b2-4ebd62d38c4a"). InnerVolumeSpecName "kube-api-access-zxvqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.007923 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba" (OuterVolumeSpecName: "glance") pod "319db572-e2af-4b65-95b2-4ebd62d38c4a" (UID: "319db572-e2af-4b65-95b2-4ebd62d38c4a"). InnerVolumeSpecName "pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.034390 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "319db572-e2af-4b65-95b2-4ebd62d38c4a" (UID: "319db572-e2af-4b65-95b2-4ebd62d38c4a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.080518 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "319db572-e2af-4b65-95b2-4ebd62d38c4a" (UID: "319db572-e2af-4b65-95b2-4ebd62d38c4a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.086530 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.086586 4740 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.086597 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.086606 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxvqf\" (UniqueName: \"kubernetes.io/projected/319db572-e2af-4b65-95b2-4ebd62d38c4a-kube-api-access-zxvqf\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.086639 4740 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") on node \"crc\" " Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.098297 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-config-data" (OuterVolumeSpecName: "config-data") pod "319db572-e2af-4b65-95b2-4ebd62d38c4a" (UID: "319db572-e2af-4b65-95b2-4ebd62d38c4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.143628 4740 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.143835 4740 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba") on node "crc" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.187645 4740 reconciler_common.go:293] "Volume detached for volume \"pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.187868 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/319db572-e2af-4b65-95b2-4ebd62d38c4a-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.190837 4740 generic.go:334] "Generic (PLEG): container finished" podID="319db572-e2af-4b65-95b2-4ebd62d38c4a" containerID="4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96" exitCode=143 Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.190868 4740 generic.go:334] "Generic (PLEG): container finished" podID="319db572-e2af-4b65-95b2-4ebd62d38c4a" containerID="ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2" exitCode=143 Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.190904 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"319db572-e2af-4b65-95b2-4ebd62d38c4a","Type":"ContainerDied","Data":"4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96"} Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.190932 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"319db572-e2af-4b65-95b2-4ebd62d38c4a","Type":"ContainerDied","Data":"ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2"} Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.190942 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"319db572-e2af-4b65-95b2-4ebd62d38c4a","Type":"ContainerDied","Data":"7c40dbd6d5e57917d023abcc046cee4129153c4bd62b0b8d3ecb91087a96efda"} Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.190957 4740 scope.go:117] "RemoveContainer" containerID="4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.191095 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.198465 4740 generic.go:334] "Generic (PLEG): container finished" podID="bdc3dd07-42ff-46f5-a41b-b49d590b65b2" containerID="883663c4dc8a569943b94ba10a5d5c4f465b672a3e2ef491a6c3f8f1c4df59e7" exitCode=143 Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.198495 4740 generic.go:334] "Generic (PLEG): container finished" podID="bdc3dd07-42ff-46f5-a41b-b49d590b65b2" containerID="988908b547087a37bfdcfad4847887ce39768efc0a873243d2419114e0b56181" exitCode=143 Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.198513 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bdc3dd07-42ff-46f5-a41b-b49d590b65b2","Type":"ContainerDied","Data":"883663c4dc8a569943b94ba10a5d5c4f465b672a3e2ef491a6c3f8f1c4df59e7"} Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.198578 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bdc3dd07-42ff-46f5-a41b-b49d590b65b2","Type":"ContainerDied","Data":"988908b547087a37bfdcfad4847887ce39768efc0a873243d2419114e0b56181"} Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.230642 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.250314 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.263439 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 05 14:08:39 crc kubenswrapper[4740]: E0105 14:08:39.263947 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0" containerName="init" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.263967 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0" containerName="init" Jan 05 14:08:39 crc kubenswrapper[4740]: E0105 14:08:39.264005 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319db572-e2af-4b65-95b2-4ebd62d38c4a" containerName="glance-log" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.264012 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="319db572-e2af-4b65-95b2-4ebd62d38c4a" containerName="glance-log" Jan 05 14:08:39 crc kubenswrapper[4740]: E0105 14:08:39.264023 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319db572-e2af-4b65-95b2-4ebd62d38c4a" containerName="glance-httpd" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.264029 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="319db572-e2af-4b65-95b2-4ebd62d38c4a" containerName="glance-httpd" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.264235 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d6c48a-98ad-48e3-8373-e7fc3cc47bc0" containerName="init" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.264256 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="319db572-e2af-4b65-95b2-4ebd62d38c4a" containerName="glance-log" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.264269 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="319db572-e2af-4b65-95b2-4ebd62d38c4a" containerName="glance-httpd" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.265410 4740 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.267567 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.268108 4740 scope.go:117] "RemoveContainer" containerID="ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.270019 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.277814 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.318945 4740 scope.go:117] "RemoveContainer" containerID="4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96" Jan 05 14:08:39 crc kubenswrapper[4740]: E0105 14:08:39.319454 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96\": container with ID starting with 4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96 not found: ID does not exist" containerID="4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.319509 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96"} err="failed to get container status \"4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96\": rpc error: code = NotFound desc = could not find container \"4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96\": container with ID starting with 4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96 not found: ID does not exist" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.319542 4740 scope.go:117] "RemoveContainer" containerID="ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2" Jan 05 14:08:39 crc kubenswrapper[4740]: E0105 14:08:39.319929 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2\": container with ID starting with ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2 not found: ID does not exist" containerID="ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.319947 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2"} err="failed to get container status \"ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2\": rpc error: code = NotFound desc = could not find container \"ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2\": container with ID starting with ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2 not found: ID does not exist" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.319960 4740 scope.go:117] "RemoveContainer" containerID="4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.320191 4740 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96"} err="failed to get container status \"4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96\": rpc error: code = NotFound desc = could not find container \"4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96\": container with ID starting with 4b3e3d669a24f0cc0f75f8d00d00dbd9e455cc4edc22757e4cc576265d33db96 not found: ID does not exist" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.320210 4740 scope.go:117] "RemoveContainer" containerID="ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.320608 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2"} err="failed to get container status \"ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2\": rpc error: code = NotFound desc = could not find container \"ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2\": container with ID starting with ddc231d1ff483b65f447ac8c191774cc390b10b6f73f953106a0bc6c43696fe2 not found: ID does not exist" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.406246 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-config-data\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.406390 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.406466 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.406502 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-scripts\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.406585 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-logs\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.406622 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whnpg\" (UniqueName: 
\"kubernetes.io/projected/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-kube-api-access-whnpg\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.406667 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.406693 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.507933 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.507973 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.508025 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-config-data\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.508420 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.508759 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.508818 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.508847 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-scripts\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.510213 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-logs\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.510265 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whnpg\" (UniqueName: \"kubernetes.io/projected/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-kube-api-access-whnpg\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.510694 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-logs\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.511065 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.511108 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9f181c6c175c062630ff3c011c7f7b401c107aabb3ab99d5fa4b9467f0f8d5d6/globalmount\"" pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.512948 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-scripts\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.513325 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-config-data\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.517512 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.517520 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.526869 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whnpg\" (UniqueName: \"kubernetes.io/projected/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-kube-api-access-whnpg\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.561973 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") pod \"glance-default-external-api-0\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " pod="openstack/glance-default-external-api-0" Jan 05 14:08:39 crc kubenswrapper[4740]: I0105 14:08:39.584629 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 05 14:08:40 crc kubenswrapper[4740]: I0105 14:08:40.186546 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 05 14:08:40 crc kubenswrapper[4740]: I0105 14:08:40.997642 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="319db572-e2af-4b65-95b2-4ebd62d38c4a" path="/var/lib/kubelet/pods/319db572-e2af-4b65-95b2-4ebd62d38c4a/volumes" Jan 05 14:08:41 crc kubenswrapper[4740]: I0105 14:08:41.223377 4740 generic.go:334] "Generic (PLEG): container finished" podID="7c79ced0-0772-4ae5-93e6-3410088d5d65" containerID="a7dea7bce463deeb0849a33a3a74c8beb5e29f29e178831abb5786dd73f8fca7" exitCode=0 Jan 05 14:08:41 crc kubenswrapper[4740]: I0105 14:08:41.223739 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h6jzd" event={"ID":"7c79ced0-0772-4ae5-93e6-3410088d5d65","Type":"ContainerDied","Data":"a7dea7bce463deeb0849a33a3a74c8beb5e29f29e178831abb5786dd73f8fca7"} Jan 05 14:08:44 crc kubenswrapper[4740]: I0105 14:08:44.290018 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:08:44 crc kubenswrapper[4740]: I0105 14:08:44.377040 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vj867"] Jan 05 14:08:44 crc kubenswrapper[4740]: I0105 14:08:44.377344 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" podUID="9e329289-66fc-459b-8360-f0e8df0edcc9" containerName="dnsmasq-dns" containerID="cri-o://edf6bcae04123e30ad5daccfc24ec9e527589649433100b25317cc7e703fe4ea" gracePeriod=10 Jan 05 14:08:44 crc kubenswrapper[4740]: E0105 14:08:44.760876 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadbfdbc4_e4a5_46ed_9f3e_ccf21e081e25.slice/crio-e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e\": RecentStats: unable to find data in memory cache]" Jan 05 14:08:45 crc kubenswrapper[4740]: I0105 14:08:45.267391 4740 generic.go:334] "Generic (PLEG): container finished" podID="9e329289-66fc-459b-8360-f0e8df0edcc9" containerID="edf6bcae04123e30ad5daccfc24ec9e527589649433100b25317cc7e703fe4ea" exitCode=0 Jan 05 14:08:45 crc kubenswrapper[4740]: I0105 14:08:45.267464 4740 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" event={"ID":"9e329289-66fc-459b-8360-f0e8df0edcc9","Type":"ContainerDied","Data":"edf6bcae04123e30ad5daccfc24ec9e527589649433100b25317cc7e703fe4ea"} Jan 05 14:08:45 crc kubenswrapper[4740]: I0105 14:08:45.408822 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" podUID="9e329289-66fc-459b-8360-f0e8df0edcc9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.185:5353: connect: connection refused" Jan 05 14:08:47 crc kubenswrapper[4740]: E0105 14:08:47.315990 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadbfdbc4_e4a5_46ed_9f3e_ccf21e081e25.slice/crio-e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e\": RecentStats: unable to find data in memory cache]" Jan 05 14:08:48 crc kubenswrapper[4740]: E0105 14:08:48.108449 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadbfdbc4_e4a5_46ed_9f3e_ccf21e081e25.slice/crio-e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e\": RecentStats: unable to find data in memory cache]" Jan 05 14:08:48 crc kubenswrapper[4740]: E0105 14:08:48.111938 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadbfdbc4_e4a5_46ed_9f3e_ccf21e081e25.slice/crio-e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e\": RecentStats: unable to find data in memory cache]" Jan 05 14:08:50 crc kubenswrapper[4740]: I0105 14:08:50.409281 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" podUID="9e329289-66fc-459b-8360-f0e8df0edcc9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.185:5353: connect: connection refused" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.257265 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.346868 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgctl\" (UniqueName: \"kubernetes.io/projected/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-kube-api-access-dgctl\") pod \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.346916 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-config-data\") pod \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.346947 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-logs\") pod \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.346988 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-httpd-run\") pod \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.347047 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-combined-ca-bundle\") pod \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.347164 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-scripts\") pod \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.347276 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") pod \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.347456 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-internal-tls-certs\") pod \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\" (UID: \"bdc3dd07-42ff-46f5-a41b-b49d590b65b2\") " Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.348864 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-logs" (OuterVolumeSpecName: "logs") pod "bdc3dd07-42ff-46f5-a41b-b49d590b65b2" (UID: "bdc3dd07-42ff-46f5-a41b-b49d590b65b2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.349137 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bdc3dd07-42ff-46f5-a41b-b49d590b65b2" (UID: "bdc3dd07-42ff-46f5-a41b-b49d590b65b2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.354304 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-scripts" (OuterVolumeSpecName: "scripts") pod "bdc3dd07-42ff-46f5-a41b-b49d590b65b2" (UID: "bdc3dd07-42ff-46f5-a41b-b49d590b65b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.355303 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-kube-api-access-dgctl" (OuterVolumeSpecName: "kube-api-access-dgctl") pod "bdc3dd07-42ff-46f5-a41b-b49d590b65b2" (UID: "bdc3dd07-42ff-46f5-a41b-b49d590b65b2"). InnerVolumeSpecName "kube-api-access-dgctl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.367674 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57" (OuterVolumeSpecName: "glance") pod "bdc3dd07-42ff-46f5-a41b-b49d590b65b2" (UID: "bdc3dd07-42ff-46f5-a41b-b49d590b65b2"). InnerVolumeSpecName "pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.378634 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bdc3dd07-42ff-46f5-a41b-b49d590b65b2","Type":"ContainerDied","Data":"ce2339c290f7a431d794847142ff24b308b71f976e3dfc5749731712870de336"} Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.378715 4740 scope.go:117] "RemoveContainer" containerID="883663c4dc8a569943b94ba10a5d5c4f465b672a3e2ef491a6c3f8f1c4df59e7" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.378942 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.384503 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ecdaebe5-8250-4c27-a69a-3b1ebd335a48","Type":"ContainerStarted","Data":"66bbf1604c30e7418cda7085cbf1cfedb8a4aec4860ef67fd075977465896b00"} Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.409097 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdc3dd07-42ff-46f5-a41b-b49d590b65b2" (UID: "bdc3dd07-42ff-46f5-a41b-b49d590b65b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.438574 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bdc3dd07-42ff-46f5-a41b-b49d590b65b2" (UID: "bdc3dd07-42ff-46f5-a41b-b49d590b65b2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.439762 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-config-data" (OuterVolumeSpecName: "config-data") pod "bdc3dd07-42ff-46f5-a41b-b49d590b65b2" (UID: "bdc3dd07-42ff-46f5-a41b-b49d590b65b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.450328 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.450582 4740 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") on node \"crc\" " Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.450658 4740 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.450722 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgctl\" (UniqueName: \"kubernetes.io/projected/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-kube-api-access-dgctl\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.450781 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-logs\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.450837 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.450895 4740 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.450956 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc3dd07-42ff-46f5-a41b-b49d590b65b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.485357 4740 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.485986 4740 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57") on node "crc" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.553420 4740 reconciler_common.go:293] "Volume detached for volume \"pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") on node \"crc\" DevicePath \"\"" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.727001 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.742138 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.765499 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 05 14:08:52 crc kubenswrapper[4740]: E0105 14:08:52.766020 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc3dd07-42ff-46f5-a41b-b49d590b65b2" containerName="glance-httpd" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.766038 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc3dd07-42ff-46f5-a41b-b49d590b65b2" containerName="glance-httpd" Jan 05 14:08:52 crc kubenswrapper[4740]: E0105 14:08:52.766053 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc3dd07-42ff-46f5-a41b-b49d590b65b2" containerName="glance-log" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.766075 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc3dd07-42ff-46f5-a41b-b49d590b65b2" containerName="glance-log" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.766319 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc3dd07-42ff-46f5-a41b-b49d590b65b2" containerName="glance-httpd" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.766341 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc3dd07-42ff-46f5-a41b-b49d590b65b2" containerName="glance-log" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.767626 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.780592 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.780853 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.795407 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.858509 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.858557 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.858587 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfe7c040-25ec-49bc-96a4-127e4281ae18-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.858611 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.858631 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.858657 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.858876 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbtwx\" (UniqueName: \"kubernetes.io/projected/cfe7c040-25ec-49bc-96a4-127e4281ae18-kube-api-access-cbtwx\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 
14:08:52.859055 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfe7c040-25ec-49bc-96a4-127e4281ae18-logs\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.961948 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.962039 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.962121 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfe7c040-25ec-49bc-96a4-127e4281ae18-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.962193 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.962223 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.962287 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.962411 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbtwx\" (UniqueName: \"kubernetes.io/projected/cfe7c040-25ec-49bc-96a4-127e4281ae18-kube-api-access-cbtwx\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.962494 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfe7c040-25ec-49bc-96a4-127e4281ae18-logs\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.963422 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfe7c040-25ec-49bc-96a4-127e4281ae18-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.963434 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfe7c040-25ec-49bc-96a4-127e4281ae18-logs\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.966740 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.967442 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.968651 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.968720 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.968838 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.968886 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f94b1d52da0ffd2f3c454948afdab6168cc98b014f61ee88f5256307d60f00b0/globalmount\"" pod="openstack/glance-default-internal-api-0" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.981436 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdc3dd07-42ff-46f5-a41b-b49d590b65b2" path="/var/lib/kubelet/pods/bdc3dd07-42ff-46f5-a41b-b49d590b65b2/volumes" Jan 05 14:08:52 crc kubenswrapper[4740]: I0105 14:08:52.983332 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbtwx\" (UniqueName: \"kubernetes.io/projected/cfe7c040-25ec-49bc-96a4-127e4281ae18-kube-api-access-cbtwx\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:53 crc kubenswrapper[4740]: I0105 14:08:53.024198 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") pod \"glance-default-internal-api-0\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:08:53 crc kubenswrapper[4740]: I0105 14:08:53.103826 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 05 14:08:54 crc kubenswrapper[4740]: I0105 14:08:54.407219 4740 generic.go:334] "Generic (PLEG): container finished" podID="60a4f2a8-8fa7-44b6-a6bc-15531c720f24" containerID="acbed7293a2f8b258573fd52eb80726a34d8891f6ade563484839fa998b70910" exitCode=0 Jan 05 14:08:54 crc kubenswrapper[4740]: I0105 14:08:54.407373 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rkvws" event={"ID":"60a4f2a8-8fa7-44b6-a6bc-15531c720f24","Type":"ContainerDied","Data":"acbed7293a2f8b258573fd52eb80726a34d8891f6ade563484839fa998b70910"} Jan 05 14:08:55 crc kubenswrapper[4740]: I0105 14:08:55.408932 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" podUID="9e329289-66fc-459b-8360-f0e8df0edcc9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.185:5353: connect: connection refused" Jan 05 14:08:55 crc kubenswrapper[4740]: I0105 14:08:55.409301 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:08:57 crc kubenswrapper[4740]: E0105 14:08:57.602183 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadbfdbc4_e4a5_46ed_9f3e_ccf21e081e25.slice/crio-e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e\": RecentStats: unable to find data in memory cache]" Jan 05 14:08:59 crc kubenswrapper[4740]: E0105 14:08:59.425797 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadbfdbc4_e4a5_46ed_9f3e_ccf21e081e25.slice/crio-e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e\": RecentStats: unable to find data in memory cache]" Jan 05 14:09:00 crc kubenswrapper[4740]: I0105 14:09:00.409236 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" podUID="9e329289-66fc-459b-8360-f0e8df0edcc9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.185:5353: connect: connection refused" Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.008830 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.034540 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-credential-keys\") pod \"7c79ced0-0772-4ae5-93e6-3410088d5d65\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.034634 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-combined-ca-bundle\") pod \"7c79ced0-0772-4ae5-93e6-3410088d5d65\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.034675 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mht85\" (UniqueName: \"kubernetes.io/projected/7c79ced0-0772-4ae5-93e6-3410088d5d65-kube-api-access-mht85\") pod \"7c79ced0-0772-4ae5-93e6-3410088d5d65\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.034766 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-fernet-keys\") pod \"7c79ced0-0772-4ae5-93e6-3410088d5d65\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.034826 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-config-data\") pod \"7c79ced0-0772-4ae5-93e6-3410088d5d65\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.034869 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-scripts\") pod \"7c79ced0-0772-4ae5-93e6-3410088d5d65\" (UID: \"7c79ced0-0772-4ae5-93e6-3410088d5d65\") " Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.046107 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-scripts" (OuterVolumeSpecName: "scripts") pod "7c79ced0-0772-4ae5-93e6-3410088d5d65" (UID: "7c79ced0-0772-4ae5-93e6-3410088d5d65"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.049450 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7c79ced0-0772-4ae5-93e6-3410088d5d65" (UID: "7c79ced0-0772-4ae5-93e6-3410088d5d65"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.049663 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7c79ced0-0772-4ae5-93e6-3410088d5d65" (UID: "7c79ced0-0772-4ae5-93e6-3410088d5d65"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.052747 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c79ced0-0772-4ae5-93e6-3410088d5d65-kube-api-access-mht85" (OuterVolumeSpecName: "kube-api-access-mht85") pod "7c79ced0-0772-4ae5-93e6-3410088d5d65" (UID: "7c79ced0-0772-4ae5-93e6-3410088d5d65"). InnerVolumeSpecName "kube-api-access-mht85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.098840 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-config-data" (OuterVolumeSpecName: "config-data") pod "7c79ced0-0772-4ae5-93e6-3410088d5d65" (UID: "7c79ced0-0772-4ae5-93e6-3410088d5d65"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.129154 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c79ced0-0772-4ae5-93e6-3410088d5d65" (UID: "7c79ced0-0772-4ae5-93e6-3410088d5d65"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.137031 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.137086 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mht85\" (UniqueName: \"kubernetes.io/projected/7c79ced0-0772-4ae5-93e6-3410088d5d65-kube-api-access-mht85\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.137099 4740 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.137129 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.137138 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.137147 4740 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7c79ced0-0772-4ae5-93e6-3410088d5d65-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.513354 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h6jzd" event={"ID":"7c79ced0-0772-4ae5-93e6-3410088d5d65","Type":"ContainerDied","Data":"1a79e0c1f882b1c660d00a626e44d5bda0d9052be8b64cd99641c1b67f2423fd"} Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 14:09:01.513394 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a79e0c1f882b1c660d00a626e44d5bda0d9052be8b64cd99641c1b67f2423fd" Jan 05 14:09:01 crc kubenswrapper[4740]: I0105 
14:09:01.513426 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-h6jzd" Jan 05 14:09:01 crc kubenswrapper[4740]: E0105 14:09:01.808401 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Jan 05 14:09:01 crc kubenswrapper[4740]: E0105 14:09:01.808814 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qlpvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-b2trl_openstack(401ab705-cb09-4760-840d-2b99b6c9148c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:09:01 crc kubenswrapper[4740]: E0105 14:09:01.810234 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-b2trl" podUID="401ab705-cb09-4760-840d-2b99b6c9148c" Jan 05 14:09:02 crc kubenswrapper[4740]: E0105 14:09:02.085420 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Jan 05 14:09:02 crc kubenswrapper[4740]: E0105 14:09:02.085588 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2tm2w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-979vn_openstack(ed5f3fd5-cc0b-42c1-8cee-b28452adde1d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:09:02 crc kubenswrapper[4740]: E0105 14:09:02.086794 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-979vn" podUID="ed5f3fd5-cc0b-42c1-8cee-b28452adde1d" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.127314 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-h6jzd"] Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.142130 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-h6jzd"] Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.200530 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rkvws" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.203048 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-d6mcw"] Jan 05 14:09:02 crc kubenswrapper[4740]: E0105 14:09:02.203594 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c79ced0-0772-4ae5-93e6-3410088d5d65" containerName="keystone-bootstrap" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.203612 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c79ced0-0772-4ae5-93e6-3410088d5d65" containerName="keystone-bootstrap" Jan 05 14:09:02 crc kubenswrapper[4740]: E0105 14:09:02.203649 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a4f2a8-8fa7-44b6-a6bc-15531c720f24" containerName="neutron-db-sync" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.203658 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a4f2a8-8fa7-44b6-a6bc-15531c720f24" containerName="neutron-db-sync" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.203876 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c79ced0-0772-4ae5-93e6-3410088d5d65" containerName="keystone-bootstrap" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.203895 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a4f2a8-8fa7-44b6-a6bc-15531c720f24" containerName="neutron-db-sync" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.204651 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.208496 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.208693 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.208928 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.209263 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wktmq" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.209489 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.212434 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-d6mcw"] Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.265114 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffhq7\" (UniqueName: \"kubernetes.io/projected/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-kube-api-access-ffhq7\") pod \"60a4f2a8-8fa7-44b6-a6bc-15531c720f24\" (UID: \"60a4f2a8-8fa7-44b6-a6bc-15531c720f24\") " Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.265168 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-config\") pod \"60a4f2a8-8fa7-44b6-a6bc-15531c720f24\" (UID: \"60a4f2a8-8fa7-44b6-a6bc-15531c720f24\") " Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.265216 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-combined-ca-bundle\") pod 
\"60a4f2a8-8fa7-44b6-a6bc-15531c720f24\" (UID: \"60a4f2a8-8fa7-44b6-a6bc-15531c720f24\") " Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.265426 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6jwp\" (UniqueName: \"kubernetes.io/projected/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-kube-api-access-w6jwp\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.265490 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-config-data\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.265521 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-credential-keys\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.265563 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-combined-ca-bundle\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.265588 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-fernet-keys\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.265636 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-scripts\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.270824 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-kube-api-access-ffhq7" (OuterVolumeSpecName: "kube-api-access-ffhq7") pod "60a4f2a8-8fa7-44b6-a6bc-15531c720f24" (UID: "60a4f2a8-8fa7-44b6-a6bc-15531c720f24"). InnerVolumeSpecName "kube-api-access-ffhq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.302631 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60a4f2a8-8fa7-44b6-a6bc-15531c720f24" (UID: "60a4f2a8-8fa7-44b6-a6bc-15531c720f24"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.303140 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-config" (OuterVolumeSpecName: "config") pod "60a4f2a8-8fa7-44b6-a6bc-15531c720f24" (UID: "60a4f2a8-8fa7-44b6-a6bc-15531c720f24"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.366189 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-config-data\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.366246 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-credential-keys\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.366289 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-combined-ca-bundle\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.366318 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-fernet-keys\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.366363 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-scripts\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.366435 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6jwp\" (UniqueName: \"kubernetes.io/projected/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-kube-api-access-w6jwp\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.366496 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffhq7\" (UniqueName: \"kubernetes.io/projected/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-kube-api-access-ffhq7\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.366508 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.366516 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a4f2a8-8fa7-44b6-a6bc-15531c720f24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:02 
crc kubenswrapper[4740]: I0105 14:09:02.371317 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-fernet-keys\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.372539 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-scripts\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.373389 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-config-data\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.374499 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-combined-ca-bundle\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.374947 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-credential-keys\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.383748 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6jwp\" (UniqueName: \"kubernetes.io/projected/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-kube-api-access-w6jwp\") pod \"keystone-bootstrap-d6mcw\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.522747 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rkvws" event={"ID":"60a4f2a8-8fa7-44b6-a6bc-15531c720f24","Type":"ContainerDied","Data":"2c492cbab45df40b9522ccf66c76de22e064c2f5d37393c874a61bda024969ee"} Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.523114 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c492cbab45df40b9522ccf66c76de22e064c2f5d37393c874a61bda024969ee" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.522875 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rkvws" Jan 05 14:09:02 crc kubenswrapper[4740]: E0105 14:09:02.525038 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-b2trl" podUID="401ab705-cb09-4760-840d-2b99b6c9148c" Jan 05 14:09:02 crc kubenswrapper[4740]: E0105 14:09:02.525198 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-979vn" podUID="ed5f3fd5-cc0b-42c1-8cee-b28452adde1d" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.537887 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:02 crc kubenswrapper[4740]: I0105 14:09:02.980281 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c79ced0-0772-4ae5-93e6-3410088d5d65" path="/var/lib/kubelet/pods/7c79ced0-0772-4ae5-93e6-3410088d5d65/volumes" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.432721 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-t7gd6"] Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.435323 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.451427 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-t7gd6"] Jan 05 14:09:03 crc kubenswrapper[4740]: E0105 14:09:03.505412 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Jan 05 14:09:03 crc kubenswrapper[4740]: E0105 14:09:03.505588 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-54rz4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-lzmfj_openstack(333dfe82-8fdd-400e-8b8c-89906d81e778): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:09:03 crc kubenswrapper[4740]: E0105 14:09:03.506727 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-lzmfj" podUID="333dfe82-8fdd-400e-8b8c-89906d81e778" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.519544 4740 scope.go:117] "RemoveContainer" containerID="988908b547087a37bfdcfad4847887ce39768efc0a873243d2419114e0b56181" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.582317 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" event={"ID":"9e329289-66fc-459b-8360-f0e8df0edcc9","Type":"ContainerDied","Data":"72985004c9d03c1c6ff727665bb1b0f6496c634b8db8d360b2c3fabca93429e4"} Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.582365 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72985004c9d03c1c6ff727665bb1b0f6496c634b8db8d360b2c3fabca93429e4" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.599859 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.601815 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-dns-svc\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.601928 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.604433 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-config\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.604480 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhkww\" (UniqueName: \"kubernetes.io/projected/93830ee5-2e0a-4f1e-9234-fa45767e0391-kube-api-access-dhkww\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.604615 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.622915 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56585fbbf8-r9sc8"] Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.625155 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.628961 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.629124 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.629173 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qgz2z" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.630008 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.635555 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56585fbbf8-r9sc8"] Jan 05 14:09:03 crc kubenswrapper[4740]: E0105 14:09:03.645737 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-lzmfj" podUID="333dfe82-8fdd-400e-8b8c-89906d81e778" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.706311 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.707891 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.707971 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-dns-svc\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.708464 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.709448 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.708871 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-dns-svc\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " 
pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.710679 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhkww\" (UniqueName: \"kubernetes.io/projected/93830ee5-2e0a-4f1e-9234-fa45767e0391-kube-api-access-dhkww\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.710750 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-config\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.710867 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.712187 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.718974 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-config\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.734608 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhkww\" (UniqueName: \"kubernetes.io/projected/93830ee5-2e0a-4f1e-9234-fa45767e0391-kube-api-access-dhkww\") pod \"dnsmasq-dns-6b7b667979-t7gd6\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.736673 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.757355 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.814723 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-ovndb-tls-certs\") pod \"neutron-56585fbbf8-r9sc8\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.814766 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-combined-ca-bundle\") pod \"neutron-56585fbbf8-r9sc8\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.814817 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4smpw\" (UniqueName: \"kubernetes.io/projected/390a9292-95a7-457f-b300-e8613949150c-kube-api-access-4smpw\") pod \"neutron-56585fbbf8-r9sc8\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.814956 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-config\") pod \"neutron-56585fbbf8-r9sc8\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.815037 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-httpd-config\") pod \"neutron-56585fbbf8-r9sc8\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.916225 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-config\") pod \"9e329289-66fc-459b-8360-f0e8df0edcc9\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.916636 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-ovsdbserver-nb\") pod \"9e329289-66fc-459b-8360-f0e8df0edcc9\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.916684 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-dns-svc\") pod \"9e329289-66fc-459b-8360-f0e8df0edcc9\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.916771 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-ovsdbserver-sb\") pod \"9e329289-66fc-459b-8360-f0e8df0edcc9\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.916912 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-dns-swift-storage-0\") pod \"9e329289-66fc-459b-8360-f0e8df0edcc9\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.917019 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w8bn\" (UniqueName: \"kubernetes.io/projected/9e329289-66fc-459b-8360-f0e8df0edcc9-kube-api-access-6w8bn\") pod \"9e329289-66fc-459b-8360-f0e8df0edcc9\" (UID: \"9e329289-66fc-459b-8360-f0e8df0edcc9\") " Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.917431 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-config\") pod \"neutron-56585fbbf8-r9sc8\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.917565 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-httpd-config\") pod \"neutron-56585fbbf8-r9sc8\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.917628 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-combined-ca-bundle\") pod \"neutron-56585fbbf8-r9sc8\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.917649 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-ovndb-tls-certs\") pod \"neutron-56585fbbf8-r9sc8\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.917696 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4smpw\" (UniqueName: \"kubernetes.io/projected/390a9292-95a7-457f-b300-e8613949150c-kube-api-access-4smpw\") pod \"neutron-56585fbbf8-r9sc8\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.938519 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e329289-66fc-459b-8360-f0e8df0edcc9-kube-api-access-6w8bn" (OuterVolumeSpecName: "kube-api-access-6w8bn") pod "9e329289-66fc-459b-8360-f0e8df0edcc9" (UID: "9e329289-66fc-459b-8360-f0e8df0edcc9"). InnerVolumeSpecName "kube-api-access-6w8bn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.948593 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-combined-ca-bundle\") pod \"neutron-56585fbbf8-r9sc8\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.961465 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-httpd-config\") pod \"neutron-56585fbbf8-r9sc8\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.978171 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4smpw\" (UniqueName: \"kubernetes.io/projected/390a9292-95a7-457f-b300-e8613949150c-kube-api-access-4smpw\") pod \"neutron-56585fbbf8-r9sc8\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:03 crc kubenswrapper[4740]: I0105 14:09:03.981766 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-config\") pod \"neutron-56585fbbf8-r9sc8\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.006221 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-ovndb-tls-certs\") pod \"neutron-56585fbbf8-r9sc8\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.020283 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w8bn\" (UniqueName: \"kubernetes.io/projected/9e329289-66fc-459b-8360-f0e8df0edcc9-kube-api-access-6w8bn\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.084674 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.280314 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.438704 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-d6mcw"] Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.672408 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-t7gd6"] Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.700813 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cfe7c040-25ec-49bc-96a4-127e4281ae18","Type":"ContainerStarted","Data":"b54f150460b0e76671945aef5c2a65167e930442b442e657e2411565130ff768"} Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.704358 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d6mcw" event={"ID":"62f3ba3c-1c87-449c-97e5-bfe06a37ef15","Type":"ContainerStarted","Data":"f85e53d5f65670e58c180e99c44cde9559dfa29e58f8adff2eb5019b066ca664"} Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.713043 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-vj867" Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.775709 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9e329289-66fc-459b-8360-f0e8df0edcc9" (UID: "9e329289-66fc-459b-8360-f0e8df0edcc9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.870091 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.897611 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9e329289-66fc-459b-8360-f0e8df0edcc9" (UID: "9e329289-66fc-459b-8360-f0e8df0edcc9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.919678 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9e329289-66fc-459b-8360-f0e8df0edcc9" (UID: "9e329289-66fc-459b-8360-f0e8df0edcc9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.921795 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-config" (OuterVolumeSpecName: "config") pod "9e329289-66fc-459b-8360-f0e8df0edcc9" (UID: "9e329289-66fc-459b-8360-f0e8df0edcc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.929714 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9e329289-66fc-459b-8360-f0e8df0edcc9" (UID: "9e329289-66fc-459b-8360-f0e8df0edcc9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.971745 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.971784 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.971795 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:04 crc kubenswrapper[4740]: I0105 14:09:04.971806 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9e329289-66fc-459b-8360-f0e8df0edcc9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:05 crc kubenswrapper[4740]: I0105 14:09:05.086292 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56585fbbf8-r9sc8"] Jan 05 14:09:05 crc kubenswrapper[4740]: I0105 14:09:05.101240 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vj867"] Jan 05 14:09:05 crc kubenswrapper[4740]: I0105 14:09:05.125048 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vj867"] Jan 05 14:09:05 crc kubenswrapper[4740]: I0105 14:09:05.723590 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" event={"ID":"93830ee5-2e0a-4f1e-9234-fa45767e0391","Type":"ContainerStarted","Data":"7818b658e70eb1602346b40e33e36f91fca67a1aa62518aa0f524280a0430bd2"} Jan 05 14:09:05 crc kubenswrapper[4740]: I0105 14:09:05.725735 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56585fbbf8-r9sc8" event={"ID":"390a9292-95a7-457f-b300-e8613949150c","Type":"ContainerStarted","Data":"07c5af9492de0374a0a283b8b29fbb1518c64d8ffcf62d7bc3f0a23e29a7a199"} Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.058252 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5797c5bd9-frslw"] Jan 05 14:09:06 crc kubenswrapper[4740]: E0105 14:09:06.059083 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e329289-66fc-459b-8360-f0e8df0edcc9" containerName="init" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.059183 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e329289-66fc-459b-8360-f0e8df0edcc9" containerName="init" Jan 05 14:09:06 crc kubenswrapper[4740]: E0105 14:09:06.059280 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e329289-66fc-459b-8360-f0e8df0edcc9" containerName="dnsmasq-dns" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.059403 4740 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9e329289-66fc-459b-8360-f0e8df0edcc9" containerName="dnsmasq-dns" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.059755 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e329289-66fc-459b-8360-f0e8df0edcc9" containerName="dnsmasq-dns" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.061394 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.068437 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.068616 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.084248 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5797c5bd9-frslw"] Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.194082 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-public-tls-certs\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.194437 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-internal-tls-certs\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.194472 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-ovndb-tls-certs\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.194682 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-combined-ca-bundle\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.195093 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-httpd-config\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.195143 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-config\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.195255 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlx65\" (UniqueName: 
\"kubernetes.io/projected/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-kube-api-access-nlx65\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.297179 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-combined-ca-bundle\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.297328 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-httpd-config\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.297356 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-config\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.297382 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlx65\" (UniqueName: \"kubernetes.io/projected/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-kube-api-access-nlx65\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.297432 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-public-tls-certs\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.297451 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-internal-tls-certs\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.297470 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-ovndb-tls-certs\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.303549 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-ovndb-tls-certs\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.303571 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-combined-ca-bundle\") pod \"neutron-5797c5bd9-frslw\" (UID: 
\"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.304249 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-httpd-config\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.304280 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-config\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.310147 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-internal-tls-certs\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.311740 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-public-tls-certs\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.312946 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlx65\" (UniqueName: \"kubernetes.io/projected/713e1044-70d2-45c5-a6e7-dc2fcc24ed54-kube-api-access-nlx65\") pod \"neutron-5797c5bd9-frslw\" (UID: \"713e1044-70d2-45c5-a6e7-dc2fcc24ed54\") " pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.400387 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:06 crc kubenswrapper[4740]: I0105 14:09:06.989134 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e329289-66fc-459b-8360-f0e8df0edcc9" path="/var/lib/kubelet/pods/9e329289-66fc-459b-8360-f0e8df0edcc9/volumes" Jan 05 14:09:07 crc kubenswrapper[4740]: I0105 14:09:07.490650 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5797c5bd9-frslw"] Jan 05 14:09:07 crc kubenswrapper[4740]: I0105 14:09:07.757881 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5797c5bd9-frslw" event={"ID":"713e1044-70d2-45c5-a6e7-dc2fcc24ed54","Type":"ContainerStarted","Data":"ee090a7bbf3f1f8aae697fdc5b356ef1a630e66fc8bdaf79bdec105898dbe690"} Jan 05 14:09:07 crc kubenswrapper[4740]: E0105 14:09:07.854783 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadbfdbc4_e4a5_46ed_9f3e_ccf21e081e25.slice/crio-e71125404067614ef9bed89c790fc73cf0d29094d494f14803b43598ebad842e\": RecentStats: unable to find data in memory cache]" Jan 05 14:09:10 crc kubenswrapper[4740]: I0105 14:09:10.822345 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5797c5bd9-frslw" event={"ID":"713e1044-70d2-45c5-a6e7-dc2fcc24ed54","Type":"ContainerStarted","Data":"dc3cc5dacee2d99a9d69219a6b3906ad7ac0eb8b04f2405760ba8cf48a22d4d2"} Jan 05 14:09:10 crc kubenswrapper[4740]: I0105 14:09:10.822679 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5797c5bd9-frslw" event={"ID":"713e1044-70d2-45c5-a6e7-dc2fcc24ed54","Type":"ContainerStarted","Data":"0b48c62abd2c7fc968d22b606f2bf18a8ae26ad9b291565832c24dae2c6b9c95"} Jan 05 14:09:10 crc kubenswrapper[4740]: I0105 14:09:10.822709 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:10 crc kubenswrapper[4740]: I0105 14:09:10.832780 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cfe7c040-25ec-49bc-96a4-127e4281ae18","Type":"ContainerStarted","Data":"31abe96eff881f2e28f34c9a99c430103b86b759958fd027c0a3fb182a9ad099"} Jan 05 14:09:10 crc kubenswrapper[4740]: I0105 14:09:10.841763 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ecdaebe5-8250-4c27-a69a-3b1ebd335a48","Type":"ContainerStarted","Data":"9d257f1cb533227dae1cbe31e1a03ab8d97935b72b69fb285052eddf1cb73c4b"} Jan 05 14:09:10 crc kubenswrapper[4740]: I0105 14:09:10.843765 4740 generic.go:334] "Generic (PLEG): container finished" podID="93830ee5-2e0a-4f1e-9234-fa45767e0391" containerID="01962e81caa687ad962cda09a9a7505a3758222384270ac679b42c7797cb961d" exitCode=0 Jan 05 14:09:10 crc kubenswrapper[4740]: I0105 14:09:10.843833 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" event={"ID":"93830ee5-2e0a-4f1e-9234-fa45767e0391","Type":"ContainerDied","Data":"01962e81caa687ad962cda09a9a7505a3758222384270ac679b42c7797cb961d"} Jan 05 14:09:10 crc kubenswrapper[4740]: I0105 14:09:10.869772 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56585fbbf8-r9sc8" event={"ID":"390a9292-95a7-457f-b300-e8613949150c","Type":"ContainerStarted","Data":"137bf2671fc88d1ff2344095ede9e68e7527f972709822620a3f3ebba1b00364"} Jan 05 14:09:10 crc kubenswrapper[4740]: I0105 
14:09:10.869820 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56585fbbf8-r9sc8" event={"ID":"390a9292-95a7-457f-b300-e8613949150c","Type":"ContainerStarted","Data":"ab8c29f10fda2f89b91cf9ecb7ef32c2511723b4806eedf1822931c6947be590"} Jan 05 14:09:10 crc kubenswrapper[4740]: I0105 14:09:10.870743 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:10 crc kubenswrapper[4740]: I0105 14:09:10.881751 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5797c5bd9-frslw" podStartSLOduration=4.881735975 podStartE2EDuration="4.881735975s" podCreationTimestamp="2026-01-05 14:09:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:09:10.852370977 +0000 UTC m=+1200.159279556" watchObservedRunningTime="2026-01-05 14:09:10.881735975 +0000 UTC m=+1200.188644554" Jan 05 14:09:10 crc kubenswrapper[4740]: I0105 14:09:10.882932 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-kghbd" event={"ID":"9b7a41a0-75e3-4638-9856-90adffd28751","Type":"ContainerStarted","Data":"f70e30aea270148875810e4804dc878ee6f263cb4476436356e31e95a3d7a23e"} Jan 05 14:09:10 crc kubenswrapper[4740]: I0105 14:09:10.890989 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425802c2-2528-451d-960a-24372126b18c","Type":"ContainerStarted","Data":"8e0d5775dc01bfc889f50d3f81e383edcfed499c68ec42c22984729c7a5e0863"} Jan 05 14:09:10 crc kubenswrapper[4740]: I0105 14:09:10.909690 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d6mcw" event={"ID":"62f3ba3c-1c87-449c-97e5-bfe06a37ef15","Type":"ContainerStarted","Data":"677464dd50895f81b81ed97f2d783b8a0c61200f9d99cb6d1b1d1bceafebb22f"} Jan 05 14:09:10 crc kubenswrapper[4740]: I0105 14:09:10.917369 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56585fbbf8-r9sc8" podStartSLOduration=7.917350308 podStartE2EDuration="7.917350308s" podCreationTimestamp="2026-01-05 14:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:09:10.898596386 +0000 UTC m=+1200.205504965" watchObservedRunningTime="2026-01-05 14:09:10.917350308 +0000 UTC m=+1200.224258887" Jan 05 14:09:10 crc kubenswrapper[4740]: I0105 14:09:10.939770 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-kghbd" podStartSLOduration=9.928092919000001 podStartE2EDuration="37.939750638s" podCreationTimestamp="2026-01-05 14:08:33 +0000 UTC" firstStartedPulling="2026-01-05 14:08:35.448263927 +0000 UTC m=+1164.755172506" lastFinishedPulling="2026-01-05 14:09:03.459921646 +0000 UTC m=+1192.766830225" observedRunningTime="2026-01-05 14:09:10.922566498 +0000 UTC m=+1200.229475077" watchObservedRunningTime="2026-01-05 14:09:10.939750638 +0000 UTC m=+1200.246659217" Jan 05 14:09:10 crc kubenswrapper[4740]: I0105 14:09:10.946229 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-d6mcw" podStartSLOduration=8.946213292 podStartE2EDuration="8.946213292s" podCreationTimestamp="2026-01-05 14:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 
14:09:10.940624022 +0000 UTC m=+1200.247532601" watchObservedRunningTime="2026-01-05 14:09:10.946213292 +0000 UTC m=+1200.253121871" Jan 05 14:09:11 crc kubenswrapper[4740]: I0105 14:09:11.922094 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" event={"ID":"93830ee5-2e0a-4f1e-9234-fa45767e0391","Type":"ContainerStarted","Data":"d9c76e366cf9c1556f910fbbc9ec510551a00adbb450d64e9c6f7af3c0e9d11b"} Jan 05 14:09:11 crc kubenswrapper[4740]: I0105 14:09:11.922449 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:11 crc kubenswrapper[4740]: I0105 14:09:11.926057 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cfe7c040-25ec-49bc-96a4-127e4281ae18","Type":"ContainerStarted","Data":"a64df19c895975cf0990e36d21938e7dea57cd874d75f3d8562b5365c131a1fc"} Jan 05 14:09:11 crc kubenswrapper[4740]: I0105 14:09:11.930937 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ecdaebe5-8250-4c27-a69a-3b1ebd335a48","Type":"ContainerStarted","Data":"73c75d4459e537b669822a1c2b1e48e92d76fb08eb1b0b1fe00322a5b4afb4a6"} Jan 05 14:09:11 crc kubenswrapper[4740]: I0105 14:09:11.950891 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" podStartSLOduration=8.950868987 podStartE2EDuration="8.950868987s" podCreationTimestamp="2026-01-05 14:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:09:11.943613373 +0000 UTC m=+1201.250521942" watchObservedRunningTime="2026-01-05 14:09:11.950868987 +0000 UTC m=+1201.257777576" Jan 05 14:09:11 crc kubenswrapper[4740]: I0105 14:09:11.969645 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=19.96962528 podStartE2EDuration="19.96962528s" podCreationTimestamp="2026-01-05 14:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:09:11.966821075 +0000 UTC m=+1201.273729674" watchObservedRunningTime="2026-01-05 14:09:11.96962528 +0000 UTC m=+1201.276533879" Jan 05 14:09:11 crc kubenswrapper[4740]: I0105 14:09:11.993931 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=32.993915691 podStartE2EDuration="32.993915691s" podCreationTimestamp="2026-01-05 14:08:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:09:11.990967361 +0000 UTC m=+1201.297875940" watchObservedRunningTime="2026-01-05 14:09:11.993915691 +0000 UTC m=+1201.300824270" Jan 05 14:09:13 crc kubenswrapper[4740]: I0105 14:09:13.104983 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 05 14:09:13 crc kubenswrapper[4740]: I0105 14:09:13.105421 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 05 14:09:13 crc kubenswrapper[4740]: I0105 14:09:13.147882 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 05 14:09:13 crc 
kubenswrapper[4740]: I0105 14:09:13.151779 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 05 14:09:13 crc kubenswrapper[4740]: I0105 14:09:13.951115 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425802c2-2528-451d-960a-24372126b18c","Type":"ContainerStarted","Data":"e59bce26fca4df9150498bb6e10543a8675137393eff56d64d9ec6771bc0ceb4"} Jan 05 14:09:13 crc kubenswrapper[4740]: I0105 14:09:13.951484 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 05 14:09:13 crc kubenswrapper[4740]: I0105 14:09:13.951508 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 05 14:09:14 crc kubenswrapper[4740]: I0105 14:09:14.969259 4740 generic.go:334] "Generic (PLEG): container finished" podID="9b7a41a0-75e3-4638-9856-90adffd28751" containerID="f70e30aea270148875810e4804dc878ee6f263cb4476436356e31e95a3d7a23e" exitCode=0 Jan 05 14:09:14 crc kubenswrapper[4740]: I0105 14:09:14.982877 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-kghbd" event={"ID":"9b7a41a0-75e3-4638-9856-90adffd28751","Type":"ContainerDied","Data":"f70e30aea270148875810e4804dc878ee6f263cb4476436356e31e95a3d7a23e"} Jan 05 14:09:15 crc kubenswrapper[4740]: I0105 14:09:15.981975 4740 generic.go:334] "Generic (PLEG): container finished" podID="62f3ba3c-1c87-449c-97e5-bfe06a37ef15" containerID="677464dd50895f81b81ed97f2d783b8a0c61200f9d99cb6d1b1d1bceafebb22f" exitCode=0 Jan 05 14:09:15 crc kubenswrapper[4740]: I0105 14:09:15.982076 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d6mcw" event={"ID":"62f3ba3c-1c87-449c-97e5-bfe06a37ef15","Type":"ContainerDied","Data":"677464dd50895f81b81ed97f2d783b8a0c61200f9d99cb6d1b1d1bceafebb22f"} Jan 05 14:09:18 crc kubenswrapper[4740]: I0105 14:09:18.764238 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:18 crc kubenswrapper[4740]: I0105 14:09:18.821549 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-glxvc"] Jan 05 14:09:18 crc kubenswrapper[4740]: I0105 14:09:18.821772 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" podUID="8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04" containerName="dnsmasq-dns" containerID="cri-o://dcb9b4f14b12e86a1db4aaf0888171fe96abe11fd1336ca0f511d258f29a07e5" gracePeriod=10 Jan 05 14:09:19 crc kubenswrapper[4740]: I0105 14:09:19.071812 4740 generic.go:334] "Generic (PLEG): container finished" podID="8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04" containerID="dcb9b4f14b12e86a1db4aaf0888171fe96abe11fd1336ca0f511d258f29a07e5" exitCode=0 Jan 05 14:09:19 crc kubenswrapper[4740]: I0105 14:09:19.072106 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" event={"ID":"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04","Type":"ContainerDied","Data":"dcb9b4f14b12e86a1db4aaf0888171fe96abe11fd1336ca0f511d258f29a07e5"} Jan 05 14:09:19 crc kubenswrapper[4740]: I0105 14:09:19.289411 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" podUID="8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.193:5353: connect: connection refused" Jan 05 
14:09:19 crc kubenswrapper[4740]: I0105 14:09:19.592489 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 05 14:09:19 crc kubenswrapper[4740]: I0105 14:09:19.592553 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 05 14:09:19 crc kubenswrapper[4740]: I0105 14:09:19.631348 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 05 14:09:19 crc kubenswrapper[4740]: I0105 14:09:19.645438 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.094211 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.094245 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.325884 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-kghbd" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.332988 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.437540 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-combined-ca-bundle\") pod \"9b7a41a0-75e3-4638-9856-90adffd28751\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.437626 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-scripts\") pod \"9b7a41a0-75e3-4638-9856-90adffd28751\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.437728 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7a41a0-75e3-4638-9856-90adffd28751-logs\") pod \"9b7a41a0-75e3-4638-9856-90adffd28751\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.437786 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-config-data\") pod \"9b7a41a0-75e3-4638-9856-90adffd28751\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.438004 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv5kv\" (UniqueName: \"kubernetes.io/projected/9b7a41a0-75e3-4638-9856-90adffd28751-kube-api-access-vv5kv\") pod \"9b7a41a0-75e3-4638-9856-90adffd28751\" (UID: \"9b7a41a0-75e3-4638-9856-90adffd28751\") " Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.445134 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7a41a0-75e3-4638-9856-90adffd28751-logs" (OuterVolumeSpecName: "logs") pod "9b7a41a0-75e3-4638-9856-90adffd28751" (UID: "9b7a41a0-75e3-4638-9856-90adffd28751"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.448738 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7a41a0-75e3-4638-9856-90adffd28751-kube-api-access-vv5kv" (OuterVolumeSpecName: "kube-api-access-vv5kv") pod "9b7a41a0-75e3-4638-9856-90adffd28751" (UID: "9b7a41a0-75e3-4638-9856-90adffd28751"). InnerVolumeSpecName "kube-api-access-vv5kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.466233 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-scripts" (OuterVolumeSpecName: "scripts") pod "9b7a41a0-75e3-4638-9856-90adffd28751" (UID: "9b7a41a0-75e3-4638-9856-90adffd28751"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.506291 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-config-data" (OuterVolumeSpecName: "config-data") pod "9b7a41a0-75e3-4638-9856-90adffd28751" (UID: "9b7a41a0-75e3-4638-9856-90adffd28751"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.526174 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b7a41a0-75e3-4638-9856-90adffd28751" (UID: "9b7a41a0-75e3-4638-9856-90adffd28751"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.551349 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-credential-keys\") pod \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.551420 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-combined-ca-bundle\") pod \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.551569 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-config-data\") pod \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.551621 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6jwp\" (UniqueName: \"kubernetes.io/projected/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-kube-api-access-w6jwp\") pod \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.551650 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-scripts\") pod \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\" (UID: 
\"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.551668 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-fernet-keys\") pod \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\" (UID: \"62f3ba3c-1c87-449c-97e5-bfe06a37ef15\") " Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.552218 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv5kv\" (UniqueName: \"kubernetes.io/projected/9b7a41a0-75e3-4638-9856-90adffd28751-kube-api-access-vv5kv\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.552234 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.552255 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.552264 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7a41a0-75e3-4638-9856-90adffd28751-logs\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.552272 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7a41a0-75e3-4638-9856-90adffd28751-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.555376 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "62f3ba3c-1c87-449c-97e5-bfe06a37ef15" (UID: "62f3ba3c-1c87-449c-97e5-bfe06a37ef15"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.565614 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "62f3ba3c-1c87-449c-97e5-bfe06a37ef15" (UID: "62f3ba3c-1c87-449c-97e5-bfe06a37ef15"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.575652 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-scripts" (OuterVolumeSpecName: "scripts") pod "62f3ba3c-1c87-449c-97e5-bfe06a37ef15" (UID: "62f3ba3c-1c87-449c-97e5-bfe06a37ef15"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.578362 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-kube-api-access-w6jwp" (OuterVolumeSpecName: "kube-api-access-w6jwp") pod "62f3ba3c-1c87-449c-97e5-bfe06a37ef15" (UID: "62f3ba3c-1c87-449c-97e5-bfe06a37ef15"). InnerVolumeSpecName "kube-api-access-w6jwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.649403 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62f3ba3c-1c87-449c-97e5-bfe06a37ef15" (UID: "62f3ba3c-1c87-449c-97e5-bfe06a37ef15"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.654396 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6jwp\" (UniqueName: \"kubernetes.io/projected/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-kube-api-access-w6jwp\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.654427 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.654436 4740 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.654445 4740 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.654453 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.707953 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-config-data" (OuterVolumeSpecName: "config-data") pod "62f3ba3c-1c87-449c-97e5-bfe06a37ef15" (UID: "62f3ba3c-1c87-449c-97e5-bfe06a37ef15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.767241 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62f3ba3c-1c87-449c-97e5-bfe06a37ef15-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.803487 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.970797 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-config\") pod \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.971112 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-dns-svc\") pod \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.971289 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh69w\" (UniqueName: \"kubernetes.io/projected/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-kube-api-access-nh69w\") pod \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.971464 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-ovsdbserver-sb\") pod \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.971590 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-ovsdbserver-nb\") pod \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.971664 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-dns-swift-storage-0\") pod \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\" (UID: \"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04\") " Jan 05 14:09:20 crc kubenswrapper[4740]: I0105 14:09:20.975370 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-kube-api-access-nh69w" (OuterVolumeSpecName: "kube-api-access-nh69w") pod "8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04" (UID: "8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04"). InnerVolumeSpecName "kube-api-access-nh69w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.028435 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04" (UID: "8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.051483 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-config" (OuterVolumeSpecName: "config") pod "8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04" (UID: "8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.057844 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04" (UID: "8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.072645 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04" (UID: "8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.074005 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.074038 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.074050 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.074073 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.074082 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh69w\" (UniqueName: \"kubernetes.io/projected/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-kube-api-access-nh69w\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.076805 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04" (UID: "8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.113128 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b2trl" event={"ID":"401ab705-cb09-4760-840d-2b99b6c9148c","Type":"ContainerStarted","Data":"eead28032c33e30b0db0e6f9a76d49d68f79d008b0f8d2505a89b617523bf27d"} Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.115698 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-979vn" event={"ID":"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d","Type":"ContainerStarted","Data":"550c71fbe0bc023fb2dd62de14263da5cec9c1da6bf1688f949207f404bec575"} Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.120107 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" event={"ID":"8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04","Type":"ContainerDied","Data":"f5e973165c2340b975171abe1645edbbdefed2ba25d6fb4fed6763df346a7a62"} Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.120149 4740 scope.go:117] "RemoveContainer" containerID="dcb9b4f14b12e86a1db4aaf0888171fe96abe11fd1336ca0f511d258f29a07e5" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.120219 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-glxvc" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.135994 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425802c2-2528-451d-960a-24372126b18c","Type":"ContainerStarted","Data":"c0da9ed00a6d44e62107d51f61fe908b308e041d3d119c398aa18734f6abff84"} Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.136441 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-b2trl" podStartSLOduration=3.35246142 podStartE2EDuration="48.136423967s" podCreationTimestamp="2026-01-05 14:08:33 +0000 UTC" firstStartedPulling="2026-01-05 14:08:35.556071465 +0000 UTC m=+1164.862980044" lastFinishedPulling="2026-01-05 14:09:20.340034012 +0000 UTC m=+1209.646942591" observedRunningTime="2026-01-05 14:09:21.132896693 +0000 UTC m=+1210.439805272" watchObservedRunningTime="2026-01-05 14:09:21.136423967 +0000 UTC m=+1210.443332546" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.146406 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-kghbd" event={"ID":"9b7a41a0-75e3-4638-9856-90adffd28751","Type":"ContainerDied","Data":"5d23bb36f7939423b498cd785d8f02617b255e40d832f0ecc239c44210a4ba77"} Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.146516 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d23bb36f7939423b498cd785d8f02617b255e40d832f0ecc239c44210a4ba77" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.146658 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-kghbd" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.150130 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-d6mcw" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.150180 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d6mcw" event={"ID":"62f3ba3c-1c87-449c-97e5-bfe06a37ef15","Type":"ContainerDied","Data":"f85e53d5f65670e58c180e99c44cde9559dfa29e58f8adff2eb5019b066ca664"} Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.150683 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f85e53d5f65670e58c180e99c44cde9559dfa29e58f8adff2eb5019b066ca664" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.177575 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.178743 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-979vn" podStartSLOduration=2.880137337 podStartE2EDuration="48.17871933s" podCreationTimestamp="2026-01-05 14:08:33 +0000 UTC" firstStartedPulling="2026-01-05 14:08:34.89816754 +0000 UTC m=+1164.205076119" lastFinishedPulling="2026-01-05 14:09:20.196749503 +0000 UTC m=+1209.503658112" observedRunningTime="2026-01-05 14:09:21.169378881 +0000 UTC m=+1210.476287460" watchObservedRunningTime="2026-01-05 14:09:21.17871933 +0000 UTC m=+1210.485627899" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.186128 4740 scope.go:117] "RemoveContainer" containerID="598942101b1e719c5f3b57d3c30d747213576872bf303f654f185c6fc64fb387" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.220826 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-glxvc"] Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.248412 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-glxvc"] Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.475666 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-85c7d9bc6-f85sn"] Jan 05 14:09:21 crc kubenswrapper[4740]: E0105 14:09:21.476054 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f3ba3c-1c87-449c-97e5-bfe06a37ef15" containerName="keystone-bootstrap" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.476082 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f3ba3c-1c87-449c-97e5-bfe06a37ef15" containerName="keystone-bootstrap" Jan 05 14:09:21 crc kubenswrapper[4740]: E0105 14:09:21.476099 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04" containerName="dnsmasq-dns" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.476104 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04" containerName="dnsmasq-dns" Jan 05 14:09:21 crc kubenswrapper[4740]: E0105 14:09:21.476122 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7a41a0-75e3-4638-9856-90adffd28751" containerName="placement-db-sync" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.476128 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7a41a0-75e3-4638-9856-90adffd28751" containerName="placement-db-sync" Jan 05 14:09:21 crc kubenswrapper[4740]: E0105 14:09:21.476144 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04" containerName="init" Jan 05 14:09:21 crc 
kubenswrapper[4740]: I0105 14:09:21.476150 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04" containerName="init" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.476351 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="62f3ba3c-1c87-449c-97e5-bfe06a37ef15" containerName="keystone-bootstrap" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.476370 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04" containerName="dnsmasq-dns" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.476376 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7a41a0-75e3-4638-9856-90adffd28751" containerName="placement-db-sync" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.477439 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.480565 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.480723 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8bmxt" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.480780 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.481688 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.500370 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.508091 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-85c7d9bc6-f85sn"] Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.584636 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/802f617e-6b4c-4e6e-aeab-11fa157041b9-config-data\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.584688 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/802f617e-6b4c-4e6e-aeab-11fa157041b9-combined-ca-bundle\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.584781 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/802f617e-6b4c-4e6e-aeab-11fa157041b9-scripts\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.584815 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/802f617e-6b4c-4e6e-aeab-11fa157041b9-internal-tls-certs\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" 
Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.584864 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/802f617e-6b4c-4e6e-aeab-11fa157041b9-public-tls-certs\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.584883 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl2g5\" (UniqueName: \"kubernetes.io/projected/802f617e-6b4c-4e6e-aeab-11fa157041b9-kube-api-access-dl2g5\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.584938 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/802f617e-6b4c-4e6e-aeab-11fa157041b9-logs\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.618937 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5b7fccd64c-7x22h"] Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.621094 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.624795 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.625002 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.625176 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.625302 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.625417 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wktmq" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.625896 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.647997 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5b7fccd64c-7x22h"] Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.686887 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/802f617e-6b4c-4e6e-aeab-11fa157041b9-scripts\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.686975 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/802f617e-6b4c-4e6e-aeab-11fa157041b9-internal-tls-certs\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.687083 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/802f617e-6b4c-4e6e-aeab-11fa157041b9-public-tls-certs\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.687119 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl2g5\" (UniqueName: \"kubernetes.io/projected/802f617e-6b4c-4e6e-aeab-11fa157041b9-kube-api-access-dl2g5\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.687180 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/802f617e-6b4c-4e6e-aeab-11fa157041b9-logs\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.687277 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/802f617e-6b4c-4e6e-aeab-11fa157041b9-config-data\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.687329 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/802f617e-6b4c-4e6e-aeab-11fa157041b9-combined-ca-bundle\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.691456 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/802f617e-6b4c-4e6e-aeab-11fa157041b9-logs\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.693738 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/802f617e-6b4c-4e6e-aeab-11fa157041b9-internal-tls-certs\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.694127 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/802f617e-6b4c-4e6e-aeab-11fa157041b9-combined-ca-bundle\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.709368 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/802f617e-6b4c-4e6e-aeab-11fa157041b9-public-tls-certs\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.710412 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl2g5\" (UniqueName: 
\"kubernetes.io/projected/802f617e-6b4c-4e6e-aeab-11fa157041b9-kube-api-access-dl2g5\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.710632 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/802f617e-6b4c-4e6e-aeab-11fa157041b9-scripts\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.722921 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/802f617e-6b4c-4e6e-aeab-11fa157041b9-config-data\") pod \"placement-85c7d9bc6-f85sn\" (UID: \"802f617e-6b4c-4e6e-aeab-11fa157041b9\") " pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.789869 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-scripts\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.789948 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-credential-keys\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.790002 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-config-data\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.790019 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-combined-ca-bundle\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.790288 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-public-tls-certs\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.790374 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-internal-tls-certs\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.790551 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-fernet-keys\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.790783 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ttvq\" (UniqueName: \"kubernetes.io/projected/9a37e8fa-bd7c-4449-84d9-45067f3abff7-kube-api-access-7ttvq\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.797637 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.893931 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ttvq\" (UniqueName: \"kubernetes.io/projected/9a37e8fa-bd7c-4449-84d9-45067f3abff7-kube-api-access-7ttvq\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.894054 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-scripts\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.894142 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-credential-keys\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.894210 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-config-data\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.894231 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-combined-ca-bundle\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.894320 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-public-tls-certs\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.894361 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-internal-tls-certs\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.894422 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-fernet-keys\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.898889 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-public-tls-certs\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.898985 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-credential-keys\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.899755 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-fernet-keys\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.900045 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-combined-ca-bundle\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.901532 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-scripts\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.902437 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-internal-tls-certs\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.902995 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a37e8fa-bd7c-4449-84d9-45067f3abff7-config-data\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.909170 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ttvq\" (UniqueName: \"kubernetes.io/projected/9a37e8fa-bd7c-4449-84d9-45067f3abff7-kube-api-access-7ttvq\") pod \"keystone-5b7fccd64c-7x22h\" (UID: \"9a37e8fa-bd7c-4449-84d9-45067f3abff7\") " pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:21 crc kubenswrapper[4740]: I0105 14:09:21.949131 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:22 crc kubenswrapper[4740]: I0105 14:09:22.190480 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lzmfj" event={"ID":"333dfe82-8fdd-400e-8b8c-89906d81e778","Type":"ContainerStarted","Data":"76bdb405b0441aa360ffd3974db41ffab710eb9b38e2e849c1d97b36aea7789a"} Jan 05 14:09:22 crc kubenswrapper[4740]: I0105 14:09:22.208630 4740 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 05 14:09:22 crc kubenswrapper[4740]: I0105 14:09:22.208658 4740 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 05 14:09:22 crc kubenswrapper[4740]: I0105 14:09:22.221424 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-lzmfj" podStartSLOduration=4.490244524 podStartE2EDuration="49.221389174s" podCreationTimestamp="2026-01-05 14:08:33 +0000 UTC" firstStartedPulling="2026-01-05 14:08:35.607897484 +0000 UTC m=+1164.914806063" lastFinishedPulling="2026-01-05 14:09:20.339042134 +0000 UTC m=+1209.645950713" observedRunningTime="2026-01-05 14:09:22.211941242 +0000 UTC m=+1211.518849841" watchObservedRunningTime="2026-01-05 14:09:22.221389174 +0000 UTC m=+1211.528297753" Jan 05 14:09:22 crc kubenswrapper[4740]: I0105 14:09:22.437226 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-85c7d9bc6-f85sn"] Jan 05 14:09:22 crc kubenswrapper[4740]: I0105 14:09:22.691953 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5b7fccd64c-7x22h"] Jan 05 14:09:22 crc kubenswrapper[4740]: I0105 14:09:22.985742 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04" path="/var/lib/kubelet/pods/8d17c505-ebdd-4f5d-8a0f-e7b939a9ac04/volumes" Jan 05 14:09:23 crc kubenswrapper[4740]: I0105 14:09:23.230875 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b7fccd64c-7x22h" event={"ID":"9a37e8fa-bd7c-4449-84d9-45067f3abff7","Type":"ContainerStarted","Data":"dbaa8e65da4ace9543f621ed94964e47a6b0b82c0285ae6d1ce1bd6f92e40941"} Jan 05 14:09:23 crc kubenswrapper[4740]: I0105 14:09:23.232367 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b7fccd64c-7x22h" event={"ID":"9a37e8fa-bd7c-4449-84d9-45067f3abff7","Type":"ContainerStarted","Data":"1318617796fc5bf15fcfa5845c1a8b0db4575bf7f43333d67c24b3fc454a357d"} Jan 05 14:09:23 crc kubenswrapper[4740]: I0105 14:09:23.233399 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:23 crc kubenswrapper[4740]: I0105 14:09:23.253194 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-85c7d9bc6-f85sn" event={"ID":"802f617e-6b4c-4e6e-aeab-11fa157041b9","Type":"ContainerStarted","Data":"0768df3e075873751b49a3682920a8124126768f87a5ebb035ebea482897a270"} Jan 05 14:09:23 crc kubenswrapper[4740]: I0105 14:09:23.253269 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-85c7d9bc6-f85sn" event={"ID":"802f617e-6b4c-4e6e-aeab-11fa157041b9","Type":"ContainerStarted","Data":"d60a5240a819bf591ff1aedf773a40e1d38b0e040221e366dbd1868861e3e8ba"} Jan 05 14:09:23 crc kubenswrapper[4740]: I0105 14:09:23.253290 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-85c7d9bc6-f85sn" 
event={"ID":"802f617e-6b4c-4e6e-aeab-11fa157041b9","Type":"ContainerStarted","Data":"3876197a6b64f81d2c24ba038c3f7b66a16dafa6bf96a3525688637cfea41e4c"} Jan 05 14:09:23 crc kubenswrapper[4740]: I0105 14:09:23.253340 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:23 crc kubenswrapper[4740]: I0105 14:09:23.253372 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:23 crc kubenswrapper[4740]: I0105 14:09:23.268243 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5b7fccd64c-7x22h" podStartSLOduration=2.26822211 podStartE2EDuration="2.26822211s" podCreationTimestamp="2026-01-05 14:09:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:09:23.248080671 +0000 UTC m=+1212.554989250" watchObservedRunningTime="2026-01-05 14:09:23.26822211 +0000 UTC m=+1212.575130689" Jan 05 14:09:23 crc kubenswrapper[4740]: I0105 14:09:23.913461 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 05 14:09:23 crc kubenswrapper[4740]: I0105 14:09:23.913633 4740 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 05 14:09:23 crc kubenswrapper[4740]: I0105 14:09:23.953580 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-85c7d9bc6-f85sn" podStartSLOduration=2.9535565310000003 podStartE2EDuration="2.953556531s" podCreationTimestamp="2026-01-05 14:09:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:09:23.283658884 +0000 UTC m=+1212.590567483" watchObservedRunningTime="2026-01-05 14:09:23.953556531 +0000 UTC m=+1213.260465110" Jan 05 14:09:23 crc kubenswrapper[4740]: I0105 14:09:23.978280 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 05 14:09:25 crc kubenswrapper[4740]: I0105 14:09:25.278636 4740 generic.go:334] "Generic (PLEG): container finished" podID="401ab705-cb09-4760-840d-2b99b6c9148c" containerID="eead28032c33e30b0db0e6f9a76d49d68f79d008b0f8d2505a89b617523bf27d" exitCode=0 Jan 05 14:09:25 crc kubenswrapper[4740]: I0105 14:09:25.280202 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b2trl" event={"ID":"401ab705-cb09-4760-840d-2b99b6c9148c","Type":"ContainerDied","Data":"eead28032c33e30b0db0e6f9a76d49d68f79d008b0f8d2505a89b617523bf27d"} Jan 05 14:09:25 crc kubenswrapper[4740]: I0105 14:09:25.356001 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 05 14:09:25 crc kubenswrapper[4740]: I0105 14:09:25.361530 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 05 14:09:26 crc kubenswrapper[4740]: I0105 14:09:26.800314 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-b2trl" Jan 05 14:09:26 crc kubenswrapper[4740]: I0105 14:09:26.922692 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlpvt\" (UniqueName: \"kubernetes.io/projected/401ab705-cb09-4760-840d-2b99b6c9148c-kube-api-access-qlpvt\") pod \"401ab705-cb09-4760-840d-2b99b6c9148c\" (UID: \"401ab705-cb09-4760-840d-2b99b6c9148c\") " Jan 05 14:09:26 crc kubenswrapper[4740]: I0105 14:09:26.922890 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/401ab705-cb09-4760-840d-2b99b6c9148c-db-sync-config-data\") pod \"401ab705-cb09-4760-840d-2b99b6c9148c\" (UID: \"401ab705-cb09-4760-840d-2b99b6c9148c\") " Jan 05 14:09:26 crc kubenswrapper[4740]: I0105 14:09:26.922938 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401ab705-cb09-4760-840d-2b99b6c9148c-combined-ca-bundle\") pod \"401ab705-cb09-4760-840d-2b99b6c9148c\" (UID: \"401ab705-cb09-4760-840d-2b99b6c9148c\") " Jan 05 14:09:26 crc kubenswrapper[4740]: I0105 14:09:26.929939 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/401ab705-cb09-4760-840d-2b99b6c9148c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "401ab705-cb09-4760-840d-2b99b6c9148c" (UID: "401ab705-cb09-4760-840d-2b99b6c9148c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:26 crc kubenswrapper[4740]: I0105 14:09:26.932536 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/401ab705-cb09-4760-840d-2b99b6c9148c-kube-api-access-qlpvt" (OuterVolumeSpecName: "kube-api-access-qlpvt") pod "401ab705-cb09-4760-840d-2b99b6c9148c" (UID: "401ab705-cb09-4760-840d-2b99b6c9148c"). InnerVolumeSpecName "kube-api-access-qlpvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:09:26 crc kubenswrapper[4740]: I0105 14:09:26.973185 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/401ab705-cb09-4760-840d-2b99b6c9148c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "401ab705-cb09-4760-840d-2b99b6c9148c" (UID: "401ab705-cb09-4760-840d-2b99b6c9148c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.026234 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlpvt\" (UniqueName: \"kubernetes.io/projected/401ab705-cb09-4760-840d-2b99b6c9148c-kube-api-access-qlpvt\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.026266 4740 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/401ab705-cb09-4760-840d-2b99b6c9148c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.026276 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401ab705-cb09-4760-840d-2b99b6c9148c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.302171 4740 generic.go:334] "Generic (PLEG): container finished" podID="ed5f3fd5-cc0b-42c1-8cee-b28452adde1d" containerID="550c71fbe0bc023fb2dd62de14263da5cec9c1da6bf1688f949207f404bec575" exitCode=0 Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.302264 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-979vn" event={"ID":"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d","Type":"ContainerDied","Data":"550c71fbe0bc023fb2dd62de14263da5cec9c1da6bf1688f949207f404bec575"} Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.304259 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b2trl" event={"ID":"401ab705-cb09-4760-840d-2b99b6c9148c","Type":"ContainerDied","Data":"a45320334ab3d9d48b06605805aaad6f7dfa1145f29060a0b7bf6720736382fa"} Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.304297 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a45320334ab3d9d48b06605805aaad6f7dfa1145f29060a0b7bf6720736382fa" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.304315 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-b2trl" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.542184 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7f99fddc9f-j57sh"] Jan 05 14:09:27 crc kubenswrapper[4740]: E0105 14:09:27.542599 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401ab705-cb09-4760-840d-2b99b6c9148c" containerName="barbican-db-sync" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.542614 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="401ab705-cb09-4760-840d-2b99b6c9148c" containerName="barbican-db-sync" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.542862 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="401ab705-cb09-4760-840d-2b99b6c9148c" containerName="barbican-db-sync" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.543970 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7f99fddc9f-j57sh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.550051 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-mgm5g" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.550288 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.550422 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.560429 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7f99fddc9f-j57sh"] Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.631687 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7c988b84c4-q8wdn"] Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.633552 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.648798 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c988b84c4-q8wdn"] Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.648960 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.650979 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98684691-8ef2-4cb5-85e2-fe5913a5b3c0-config-data-custom\") pod \"barbican-keystone-listener-7c988b84c4-q8wdn\" (UID: \"98684691-8ef2-4cb5-85e2-fe5913a5b3c0\") " pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.651124 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98684691-8ef2-4cb5-85e2-fe5913a5b3c0-combined-ca-bundle\") pod \"barbican-keystone-listener-7c988b84c4-q8wdn\" (UID: \"98684691-8ef2-4cb5-85e2-fe5913a5b3c0\") " pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.651222 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrhf2\" (UniqueName: \"kubernetes.io/projected/98684691-8ef2-4cb5-85e2-fe5913a5b3c0-kube-api-access-rrhf2\") pod \"barbican-keystone-listener-7c988b84c4-q8wdn\" (UID: \"98684691-8ef2-4cb5-85e2-fe5913a5b3c0\") " pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.651336 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98684691-8ef2-4cb5-85e2-fe5913a5b3c0-config-data\") pod \"barbican-keystone-listener-7c988b84c4-q8wdn\" (UID: \"98684691-8ef2-4cb5-85e2-fe5913a5b3c0\") " pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.651446 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98684691-8ef2-4cb5-85e2-fe5913a5b3c0-logs\") pod 
\"barbican-keystone-listener-7c988b84c4-q8wdn\" (UID: \"98684691-8ef2-4cb5-85e2-fe5913a5b3c0\") " pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.704041 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-mskbh"] Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.705932 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.730182 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-mskbh"] Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.754432 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.754485 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e339bc8-cbcb-48ac-84bd-ad37fcd552c0-logs\") pod \"barbican-worker-7f99fddc9f-j57sh\" (UID: \"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0\") " pod="openstack/barbican-worker-7f99fddc9f-j57sh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.754512 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98684691-8ef2-4cb5-85e2-fe5913a5b3c0-combined-ca-bundle\") pod \"barbican-keystone-listener-7c988b84c4-q8wdn\" (UID: \"98684691-8ef2-4cb5-85e2-fe5913a5b3c0\") " pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.754535 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.754575 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrhf2\" (UniqueName: \"kubernetes.io/projected/98684691-8ef2-4cb5-85e2-fe5913a5b3c0-kube-api-access-rrhf2\") pod \"barbican-keystone-listener-7c988b84c4-q8wdn\" (UID: \"98684691-8ef2-4cb5-85e2-fe5913a5b3c0\") " pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.754597 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e339bc8-cbcb-48ac-84bd-ad37fcd552c0-config-data\") pod \"barbican-worker-7f99fddc9f-j57sh\" (UID: \"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0\") " pod="openstack/barbican-worker-7f99fddc9f-j57sh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.754659 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98684691-8ef2-4cb5-85e2-fe5913a5b3c0-config-data\") pod \"barbican-keystone-listener-7c988b84c4-q8wdn\" (UID: \"98684691-8ef2-4cb5-85e2-fe5913a5b3c0\") " pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" 
Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.754684 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e339bc8-cbcb-48ac-84bd-ad37fcd552c0-combined-ca-bundle\") pod \"barbican-worker-7f99fddc9f-j57sh\" (UID: \"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0\") " pod="openstack/barbican-worker-7f99fddc9f-j57sh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.754725 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq722\" (UniqueName: \"kubernetes.io/projected/e67f8caf-f4af-41a6-bb90-07687d1e2c70-kube-api-access-wq722\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.754761 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.754787 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98684691-8ef2-4cb5-85e2-fe5913a5b3c0-logs\") pod \"barbican-keystone-listener-7c988b84c4-q8wdn\" (UID: \"98684691-8ef2-4cb5-85e2-fe5913a5b3c0\") " pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.754828 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-config\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.754854 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e339bc8-cbcb-48ac-84bd-ad37fcd552c0-config-data-custom\") pod \"barbican-worker-7f99fddc9f-j57sh\" (UID: \"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0\") " pod="openstack/barbican-worker-7f99fddc9f-j57sh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.754880 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.754900 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98684691-8ef2-4cb5-85e2-fe5913a5b3c0-config-data-custom\") pod \"barbican-keystone-listener-7c988b84c4-q8wdn\" (UID: \"98684691-8ef2-4cb5-85e2-fe5913a5b3c0\") " pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.754922 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpwhz\" (UniqueName: 
\"kubernetes.io/projected/6e339bc8-cbcb-48ac-84bd-ad37fcd552c0-kube-api-access-xpwhz\") pod \"barbican-worker-7f99fddc9f-j57sh\" (UID: \"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0\") " pod="openstack/barbican-worker-7f99fddc9f-j57sh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.756330 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98684691-8ef2-4cb5-85e2-fe5913a5b3c0-logs\") pod \"barbican-keystone-listener-7c988b84c4-q8wdn\" (UID: \"98684691-8ef2-4cb5-85e2-fe5913a5b3c0\") " pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.763886 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98684691-8ef2-4cb5-85e2-fe5913a5b3c0-config-data-custom\") pod \"barbican-keystone-listener-7c988b84c4-q8wdn\" (UID: \"98684691-8ef2-4cb5-85e2-fe5913a5b3c0\") " pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.768419 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98684691-8ef2-4cb5-85e2-fe5913a5b3c0-combined-ca-bundle\") pod \"barbican-keystone-listener-7c988b84c4-q8wdn\" (UID: \"98684691-8ef2-4cb5-85e2-fe5913a5b3c0\") " pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.778111 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98684691-8ef2-4cb5-85e2-fe5913a5b3c0-config-data\") pod \"barbican-keystone-listener-7c988b84c4-q8wdn\" (UID: \"98684691-8ef2-4cb5-85e2-fe5913a5b3c0\") " pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.780743 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrhf2\" (UniqueName: \"kubernetes.io/projected/98684691-8ef2-4cb5-85e2-fe5913a5b3c0-kube-api-access-rrhf2\") pod \"barbican-keystone-listener-7c988b84c4-q8wdn\" (UID: \"98684691-8ef2-4cb5-85e2-fe5913a5b3c0\") " pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.856120 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5b5964cd58-f9xhr"] Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.857936 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.858964 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b5964cd58-f9xhr"] Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.861327 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e339bc8-cbcb-48ac-84bd-ad37fcd552c0-combined-ca-bundle\") pod \"barbican-worker-7f99fddc9f-j57sh\" (UID: \"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0\") " pod="openstack/barbican-worker-7f99fddc9f-j57sh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.861382 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq722\" (UniqueName: \"kubernetes.io/projected/e67f8caf-f4af-41a6-bb90-07687d1e2c70-kube-api-access-wq722\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.861402 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.861444 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-config\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.861464 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e339bc8-cbcb-48ac-84bd-ad37fcd552c0-config-data-custom\") pod \"barbican-worker-7f99fddc9f-j57sh\" (UID: \"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0\") " pod="openstack/barbican-worker-7f99fddc9f-j57sh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.861488 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.861509 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpwhz\" (UniqueName: \"kubernetes.io/projected/6e339bc8-cbcb-48ac-84bd-ad37fcd552c0-kube-api-access-xpwhz\") pod \"barbican-worker-7f99fddc9f-j57sh\" (UID: \"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0\") " pod="openstack/barbican-worker-7f99fddc9f-j57sh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.861532 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-config-data\") pod \"barbican-api-5b5964cd58-f9xhr\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.861562 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0001a65f-37dd-467e-b526-921e55a3152f-logs\") pod \"barbican-api-5b5964cd58-f9xhr\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.861585 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.861607 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-combined-ca-bundle\") pod \"barbican-api-5b5964cd58-f9xhr\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.861625 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e339bc8-cbcb-48ac-84bd-ad37fcd552c0-logs\") pod \"barbican-worker-7f99fddc9f-j57sh\" (UID: \"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0\") " pod="openstack/barbican-worker-7f99fddc9f-j57sh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.861645 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.861683 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e339bc8-cbcb-48ac-84bd-ad37fcd552c0-config-data\") pod \"barbican-worker-7f99fddc9f-j57sh\" (UID: \"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0\") " pod="openstack/barbican-worker-7f99fddc9f-j57sh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.861704 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-config-data-custom\") pod \"barbican-api-5b5964cd58-f9xhr\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.861732 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5s7q\" (UniqueName: \"kubernetes.io/projected/0001a65f-37dd-467e-b526-921e55a3152f-kube-api-access-p5s7q\") pod \"barbican-api-5b5964cd58-f9xhr\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.864821 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-config\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.865555 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.865932 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e339bc8-cbcb-48ac-84bd-ad37fcd552c0-logs\") pod \"barbican-worker-7f99fddc9f-j57sh\" (UID: \"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0\") " pod="openstack/barbican-worker-7f99fddc9f-j57sh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.867441 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.869334 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e339bc8-cbcb-48ac-84bd-ad37fcd552c0-config-data-custom\") pod \"barbican-worker-7f99fddc9f-j57sh\" (UID: \"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0\") " pod="openstack/barbican-worker-7f99fddc9f-j57sh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.874973 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e339bc8-cbcb-48ac-84bd-ad37fcd552c0-combined-ca-bundle\") pod \"barbican-worker-7f99fddc9f-j57sh\" (UID: \"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0\") " pod="openstack/barbican-worker-7f99fddc9f-j57sh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.877342 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e339bc8-cbcb-48ac-84bd-ad37fcd552c0-config-data\") pod \"barbican-worker-7f99fddc9f-j57sh\" (UID: \"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0\") " pod="openstack/barbican-worker-7f99fddc9f-j57sh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.890736 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.899317 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpwhz\" (UniqueName: \"kubernetes.io/projected/6e339bc8-cbcb-48ac-84bd-ad37fcd552c0-kube-api-access-xpwhz\") pod \"barbican-worker-7f99fddc9f-j57sh\" (UID: \"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0\") " pod="openstack/barbican-worker-7f99fddc9f-j57sh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.904232 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.904768 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq722\" (UniqueName: \"kubernetes.io/projected/e67f8caf-f4af-41a6-bb90-07687d1e2c70-kube-api-access-wq722\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc 
kubenswrapper[4740]: I0105 14:09:27.914394 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-mskbh\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.967322 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-config-data-custom\") pod \"barbican-api-5b5964cd58-f9xhr\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.967394 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5s7q\" (UniqueName: \"kubernetes.io/projected/0001a65f-37dd-467e-b526-921e55a3152f-kube-api-access-p5s7q\") pod \"barbican-api-5b5964cd58-f9xhr\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.967483 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.967569 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-config-data\") pod \"barbican-api-5b5964cd58-f9xhr\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.967613 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0001a65f-37dd-467e-b526-921e55a3152f-logs\") pod \"barbican-api-5b5964cd58-f9xhr\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.967655 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-combined-ca-bundle\") pod \"barbican-api-5b5964cd58-f9xhr\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.974492 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0001a65f-37dd-467e-b526-921e55a3152f-logs\") pod \"barbican-api-5b5964cd58-f9xhr\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.978383 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-combined-ca-bundle\") pod \"barbican-api-5b5964cd58-f9xhr\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.982963 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-config-data\") pod \"barbican-api-5b5964cd58-f9xhr\" (UID: 
\"0001a65f-37dd-467e-b526-921e55a3152f\") " pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.985303 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-config-data-custom\") pod \"barbican-api-5b5964cd58-f9xhr\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:27 crc kubenswrapper[4740]: I0105 14:09:27.996810 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5s7q\" (UniqueName: \"kubernetes.io/projected/0001a65f-37dd-467e-b526-921e55a3152f-kube-api-access-p5s7q\") pod \"barbican-api-5b5964cd58-f9xhr\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:28 crc kubenswrapper[4740]: I0105 14:09:28.050587 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:28 crc kubenswrapper[4740]: I0105 14:09:28.068448 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:28 crc kubenswrapper[4740]: I0105 14:09:28.172121 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7f99fddc9f-j57sh" Jan 05 14:09:29 crc kubenswrapper[4740]: I0105 14:09:29.333338 4740 generic.go:334] "Generic (PLEG): container finished" podID="333dfe82-8fdd-400e-8b8c-89906d81e778" containerID="76bdb405b0441aa360ffd3974db41ffab710eb9b38e2e849c1d97b36aea7789a" exitCode=0 Jan 05 14:09:29 crc kubenswrapper[4740]: I0105 14:09:29.333420 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lzmfj" event={"ID":"333dfe82-8fdd-400e-8b8c-89906d81e778","Type":"ContainerDied","Data":"76bdb405b0441aa360ffd3974db41ffab710eb9b38e2e849c1d97b36aea7789a"} Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.630151 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5dc98bb458-s9tqv"] Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.632296 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.637624 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.637833 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.641197 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5dc98bb458-s9tqv"] Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.737843 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-config-data-custom\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.738207 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-config-data\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.738564 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-logs\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.739836 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-internal-tls-certs\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.739988 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrmmw\" (UniqueName: \"kubernetes.io/projected/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-kube-api-access-wrmmw\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.740179 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-combined-ca-bundle\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.740365 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-public-tls-certs\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.843073 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-internal-tls-certs\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.843162 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrmmw\" (UniqueName: \"kubernetes.io/projected/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-kube-api-access-wrmmw\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.843236 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-combined-ca-bundle\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.843305 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-public-tls-certs\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.843380 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-config-data-custom\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.843401 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-config-data\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.843433 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-logs\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.843939 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-logs\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.850274 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-config-data-custom\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.854587 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-internal-tls-certs\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.861199 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-public-tls-certs\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.864355 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-config-data\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.864743 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-combined-ca-bundle\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.905151 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrmmw\" (UniqueName: \"kubernetes.io/projected/1f5ea36a-57ff-4d4e-ac3d-914d2278cd96-kube-api-access-wrmmw\") pod \"barbican-api-5dc98bb458-s9tqv\" (UID: \"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96\") " pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:30 crc kubenswrapper[4740]: I0105 14:09:30.960827 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:31 crc kubenswrapper[4740]: I0105 14:09:31.916279 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:09:31 crc kubenswrapper[4740]: I0105 14:09:31.916565 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.279132 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-979vn" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.288680 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.365892 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lzmfj" event={"ID":"333dfe82-8fdd-400e-8b8c-89906d81e778","Type":"ContainerDied","Data":"7448cdc041c0f54f9733f2ab65e6f420cae4e64f9d5114acc836aad83277d3ec"} Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.365921 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-lzmfj" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.365941 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7448cdc041c0f54f9733f2ab65e6f420cae4e64f9d5114acc836aad83277d3ec" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.367700 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-979vn" event={"ID":"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d","Type":"ContainerDied","Data":"3767ec1b5e43de8baff2bb383872df7fab367ac0f5f9a038f7151c61b39d0242"} Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.367730 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3767ec1b5e43de8baff2bb383872df7fab367ac0f5f9a038f7151c61b39d0242" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.367772 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-979vn" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.478566 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-combined-ca-bundle\") pod \"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d\" (UID: \"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d\") " Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.478632 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-config-data\") pod \"333dfe82-8fdd-400e-8b8c-89906d81e778\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.478874 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-config-data\") pod \"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d\" (UID: \"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d\") " Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.478918 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-scripts\") pod \"333dfe82-8fdd-400e-8b8c-89906d81e778\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.478962 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-combined-ca-bundle\") pod \"333dfe82-8fdd-400e-8b8c-89906d81e778\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.479009 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54rz4\" (UniqueName: \"kubernetes.io/projected/333dfe82-8fdd-400e-8b8c-89906d81e778-kube-api-access-54rz4\") pod \"333dfe82-8fdd-400e-8b8c-89906d81e778\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.479108 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/333dfe82-8fdd-400e-8b8c-89906d81e778-etc-machine-id\") pod \"333dfe82-8fdd-400e-8b8c-89906d81e778\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.479245 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2tm2w\" (UniqueName: \"kubernetes.io/projected/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-kube-api-access-2tm2w\") pod \"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d\" (UID: \"ed5f3fd5-cc0b-42c1-8cee-b28452adde1d\") " Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.479290 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-db-sync-config-data\") pod \"333dfe82-8fdd-400e-8b8c-89906d81e778\" (UID: \"333dfe82-8fdd-400e-8b8c-89906d81e778\") " Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.479442 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/333dfe82-8fdd-400e-8b8c-89906d81e778-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "333dfe82-8fdd-400e-8b8c-89906d81e778" (UID: "333dfe82-8fdd-400e-8b8c-89906d81e778"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.480764 4740 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/333dfe82-8fdd-400e-8b8c-89906d81e778-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.483194 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-kube-api-access-2tm2w" (OuterVolumeSpecName: "kube-api-access-2tm2w") pod "ed5f3fd5-cc0b-42c1-8cee-b28452adde1d" (UID: "ed5f3fd5-cc0b-42c1-8cee-b28452adde1d"). InnerVolumeSpecName "kube-api-access-2tm2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.489770 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "333dfe82-8fdd-400e-8b8c-89906d81e778" (UID: "333dfe82-8fdd-400e-8b8c-89906d81e778"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.490049 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-scripts" (OuterVolumeSpecName: "scripts") pod "333dfe82-8fdd-400e-8b8c-89906d81e778" (UID: "333dfe82-8fdd-400e-8b8c-89906d81e778"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.506275 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/333dfe82-8fdd-400e-8b8c-89906d81e778-kube-api-access-54rz4" (OuterVolumeSpecName: "kube-api-access-54rz4") pod "333dfe82-8fdd-400e-8b8c-89906d81e778" (UID: "333dfe82-8fdd-400e-8b8c-89906d81e778"). InnerVolumeSpecName "kube-api-access-54rz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.533316 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed5f3fd5-cc0b-42c1-8cee-b28452adde1d" (UID: "ed5f3fd5-cc0b-42c1-8cee-b28452adde1d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.542515 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-config-data" (OuterVolumeSpecName: "config-data") pod "333dfe82-8fdd-400e-8b8c-89906d81e778" (UID: "333dfe82-8fdd-400e-8b8c-89906d81e778"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.544989 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "333dfe82-8fdd-400e-8b8c-89906d81e778" (UID: "333dfe82-8fdd-400e-8b8c-89906d81e778"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.579930 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-config-data" (OuterVolumeSpecName: "config-data") pod "ed5f3fd5-cc0b-42c1-8cee-b28452adde1d" (UID: "ed5f3fd5-cc0b-42c1-8cee-b28452adde1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.591001 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tm2w\" (UniqueName: \"kubernetes.io/projected/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-kube-api-access-2tm2w\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.591049 4740 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.591080 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.591094 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.591106 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.591117 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.591129 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333dfe82-8fdd-400e-8b8c-89906d81e778-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:32 crc kubenswrapper[4740]: I0105 14:09:32.591140 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54rz4\" (UniqueName: \"kubernetes.io/projected/333dfe82-8fdd-400e-8b8c-89906d81e778-kube-api-access-54rz4\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:33 crc kubenswrapper[4740]: 
I0105 14:09:33.381374 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425802c2-2528-451d-960a-24372126b18c","Type":"ContainerStarted","Data":"43b6b7c4584a0ece7548024dea78025680a0020db633f2707fc2e8fa25ba7e31"} Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.383532 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.381617 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="425802c2-2528-451d-960a-24372126b18c" containerName="sg-core" containerID="cri-o://c0da9ed00a6d44e62107d51f61fe908b308e041d3d119c398aa18734f6abff84" gracePeriod=30 Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.381648 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="425802c2-2528-451d-960a-24372126b18c" containerName="ceilometer-notification-agent" containerID="cri-o://e59bce26fca4df9150498bb6e10543a8675137393eff56d64d9ec6771bc0ceb4" gracePeriod=30 Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.381670 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="425802c2-2528-451d-960a-24372126b18c" containerName="proxy-httpd" containerID="cri-o://43b6b7c4584a0ece7548024dea78025680a0020db633f2707fc2e8fa25ba7e31" gracePeriod=30 Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.381536 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="425802c2-2528-451d-960a-24372126b18c" containerName="ceilometer-central-agent" containerID="cri-o://8e0d5775dc01bfc889f50d3f81e383edcfed499c68ec42c22984729c7a5e0863" gracePeriod=30 Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.444271 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.109587975 podStartE2EDuration="1m0.444251917s" podCreationTimestamp="2026-01-05 14:08:33 +0000 UTC" firstStartedPulling="2026-01-05 14:08:35.555742596 +0000 UTC m=+1164.862651165" lastFinishedPulling="2026-01-05 14:09:32.890406528 +0000 UTC m=+1222.197315107" observedRunningTime="2026-01-05 14:09:33.429610944 +0000 UTC m=+1222.736519523" watchObservedRunningTime="2026-01-05 14:09:33.444251917 +0000 UTC m=+1222.751160496" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.641872 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 05 14:09:33 crc kubenswrapper[4740]: E0105 14:09:33.642733 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed5f3fd5-cc0b-42c1-8cee-b28452adde1d" containerName="heat-db-sync" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.642921 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed5f3fd5-cc0b-42c1-8cee-b28452adde1d" containerName="heat-db-sync" Jan 05 14:09:33 crc kubenswrapper[4740]: E0105 14:09:33.643038 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="333dfe82-8fdd-400e-8b8c-89906d81e778" containerName="cinder-db-sync" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.643144 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="333dfe82-8fdd-400e-8b8c-89906d81e778" containerName="cinder-db-sync" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.643497 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed5f3fd5-cc0b-42c1-8cee-b28452adde1d" 
containerName="heat-db-sync" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.643624 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="333dfe82-8fdd-400e-8b8c-89906d81e778" containerName="cinder-db-sync" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.645362 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.651585 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.651754 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-gjqkx" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.656108 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.667820 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.671976 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.714112 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-mskbh"] Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.744592 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.744653 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-config-data\") pod \"cinder-scheduler-0\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.744709 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.744732 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-scripts\") pod \"cinder-scheduler-0\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.744761 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d36e906-0206-4cda-913f-9cb3fdee2adc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.744798 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njvhg\" (UniqueName: 
\"kubernetes.io/projected/4d36e906-0206-4cda-913f-9cb3fdee2adc-kube-api-access-njvhg\") pod \"cinder-scheduler-0\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.795425 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5dc98bb458-s9tqv"] Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.856828 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7f99fddc9f-j57sh"] Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.880401 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.880485 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-config-data\") pod \"cinder-scheduler-0\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.880567 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.880602 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-scripts\") pod \"cinder-scheduler-0\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.880639 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d36e906-0206-4cda-913f-9cb3fdee2adc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.880727 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njvhg\" (UniqueName: \"kubernetes.io/projected/4d36e906-0206-4cda-913f-9cb3fdee2adc-kube-api-access-njvhg\") pod \"cinder-scheduler-0\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.921531 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c988b84c4-q8wdn"] Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.921734 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d36e906-0206-4cda-913f-9cb3fdee2adc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.948245 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-config-data\") pod \"cinder-scheduler-0\" 
(UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.954162 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b5964cd58-f9xhr"] Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.970486 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.970881 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njvhg\" (UniqueName: \"kubernetes.io/projected/4d36e906-0206-4cda-913f-9cb3fdee2adc-kube-api-access-njvhg\") pod \"cinder-scheduler-0\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.970947 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.971263 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-scripts\") pod \"cinder-scheduler-0\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.984253 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-8qhgf"] Jan 05 14:09:33 crc kubenswrapper[4740]: I0105 14:09:33.987606 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.008979 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.065000 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-mskbh"] Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.091612 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-8qhgf"] Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.107655 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-dns-svc\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.107726 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-config\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.107809 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.108033 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv8hd\" (UniqueName: \"kubernetes.io/projected/cf33de39-642c-4ea4-8b3b-e505d43016f8-kube-api-access-tv8hd\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.108129 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.108191 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.132540 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.134582 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.145354 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.157536 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.174271 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.210118 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv8hd\" (UniqueName: \"kubernetes.io/projected/cf33de39-642c-4ea4-8b3b-e505d43016f8-kube-api-access-tv8hd\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.210184 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.210217 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.210260 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.210300 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-config-data-custom\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.210320 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-config-data\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.210356 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-dns-svc\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.210370 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-scripts\") pod \"cinder-api-0\" (UID: 
\"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.210405 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-config\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.210424 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8e0a645-f5dd-4e8f-a166-310bec2301cd-logs\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.210461 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn2s8\" (UniqueName: \"kubernetes.io/projected/b8e0a645-f5dd-4e8f-a166-310bec2301cd-kube-api-access-sn2s8\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.210495 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.210517 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b8e0a645-f5dd-4e8f-a166-310bec2301cd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.211855 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.211861 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-config\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.212190 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-dns-svc\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.212434 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.212844 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.277369 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv8hd\" (UniqueName: \"kubernetes.io/projected/cf33de39-642c-4ea4-8b3b-e505d43016f8-kube-api-access-tv8hd\") pod \"dnsmasq-dns-6578955fd5-8qhgf\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.312985 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.313108 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-config-data-custom\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.313131 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-config-data\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.313155 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-scripts\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.313195 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8e0a645-f5dd-4e8f-a166-310bec2301cd-logs\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.313230 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn2s8\" (UniqueName: \"kubernetes.io/projected/b8e0a645-f5dd-4e8f-a166-310bec2301cd-kube-api-access-sn2s8\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.313270 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b8e0a645-f5dd-4e8f-a166-310bec2301cd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.313351 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b8e0a645-f5dd-4e8f-a166-310bec2301cd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc 
kubenswrapper[4740]: I0105 14:09:34.322573 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8e0a645-f5dd-4e8f-a166-310bec2301cd-logs\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.327985 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-config-data\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.333438 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-scripts\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.333649 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.336578 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-config-data-custom\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.343240 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn2s8\" (UniqueName: \"kubernetes.io/projected/b8e0a645-f5dd-4e8f-a166-310bec2301cd-kube-api-access-sn2s8\") pod \"cinder-api-0\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.407649 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7f99fddc9f-j57sh" event={"ID":"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0","Type":"ContainerStarted","Data":"19bc8e3c830a94ca8fdb9eef1c6599745c8cd0968174cff2da9df6d3f7b91e3c"} Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.409938 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b5964cd58-f9xhr" event={"ID":"0001a65f-37dd-467e-b526-921e55a3152f","Type":"ContainerStarted","Data":"fdb36602ea47579e41bc0279dd21984cc7938765440cf500fcc0919216afdc20"} Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.411525 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" event={"ID":"e67f8caf-f4af-41a6-bb90-07687d1e2c70","Type":"ContainerStarted","Data":"a61c882238f037f2b5e8aef6ce5659d04d8fdf15963a6f173e8d7eb314febdf2"} Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.413683 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" event={"ID":"98684691-8ef2-4cb5-85e2-fe5913a5b3c0","Type":"ContainerStarted","Data":"3720603a8b832465f714ad3ee4b6b243d5f374522a9360b8842e72ea4c97ce89"} Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.419259 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dc98bb458-s9tqv" 
event={"ID":"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96","Type":"ContainerStarted","Data":"d8b4c9a6161277aa73d16c38060b0adf53d5aef82d9583f5d5e53e038b4f43f4"} Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.434972 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.441012 4740 generic.go:334] "Generic (PLEG): container finished" podID="425802c2-2528-451d-960a-24372126b18c" containerID="43b6b7c4584a0ece7548024dea78025680a0020db633f2707fc2e8fa25ba7e31" exitCode=0 Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.441173 4740 generic.go:334] "Generic (PLEG): container finished" podID="425802c2-2528-451d-960a-24372126b18c" containerID="c0da9ed00a6d44e62107d51f61fe908b308e041d3d119c398aa18734f6abff84" exitCode=2 Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.441234 4740 generic.go:334] "Generic (PLEG): container finished" podID="425802c2-2528-451d-960a-24372126b18c" containerID="8e0d5775dc01bfc889f50d3f81e383edcfed499c68ec42c22984729c7a5e0863" exitCode=0 Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.441446 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425802c2-2528-451d-960a-24372126b18c","Type":"ContainerDied","Data":"43b6b7c4584a0ece7548024dea78025680a0020db633f2707fc2e8fa25ba7e31"} Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.441606 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425802c2-2528-451d-960a-24372126b18c","Type":"ContainerDied","Data":"c0da9ed00a6d44e62107d51f61fe908b308e041d3d119c398aa18734f6abff84"} Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.441699 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425802c2-2528-451d-960a-24372126b18c","Type":"ContainerDied","Data":"8e0d5775dc01bfc889f50d3f81e383edcfed499c68ec42c22984729c7a5e0863"} Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.584321 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 05 14:09:34 crc kubenswrapper[4740]: I0105 14:09:34.864834 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 05 14:09:34 crc kubenswrapper[4740]: W0105 14:09:34.927238 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d36e906_0206_4cda_913f_9cb3fdee2adc.slice/crio-4c7e4dba77658e0357413efc8a7be5910be9a38986d938df32f526a58b01fd09 WatchSource:0}: Error finding container 4c7e4dba77658e0357413efc8a7be5910be9a38986d938df32f526a58b01fd09: Status 404 returned error can't find the container with id 4c7e4dba77658e0357413efc8a7be5910be9a38986d938df32f526a58b01fd09 Jan 05 14:09:35 crc kubenswrapper[4740]: I0105 14:09:35.066227 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-8qhgf"] Jan 05 14:09:35 crc kubenswrapper[4740]: W0105 14:09:35.101023 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf33de39_642c_4ea4_8b3b_e505d43016f8.slice/crio-ce478ddcede8b3f40ac08540280df66fd2b97b4a4ac81f064f25c50e91e7555b WatchSource:0}: Error finding container ce478ddcede8b3f40ac08540280df66fd2b97b4a4ac81f064f25c50e91e7555b: Status 404 returned error can't find the container with id ce478ddcede8b3f40ac08540280df66fd2b97b4a4ac81f064f25c50e91e7555b Jan 05 14:09:35 crc kubenswrapper[4740]: I0105 14:09:35.332322 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 05 14:09:35 crc kubenswrapper[4740]: I0105 14:09:35.462319 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4d36e906-0206-4cda-913f-9cb3fdee2adc","Type":"ContainerStarted","Data":"4c7e4dba77658e0357413efc8a7be5910be9a38986d938df32f526a58b01fd09"} Jan 05 14:09:35 crc kubenswrapper[4740]: I0105 14:09:35.468238 4740 generic.go:334] "Generic (PLEG): container finished" podID="e67f8caf-f4af-41a6-bb90-07687d1e2c70" containerID="f6495158ca2d64a945245b37e4778f94952dfcfff7c760f99b9fa8a81a932439" exitCode=0 Jan 05 14:09:35 crc kubenswrapper[4740]: I0105 14:09:35.468282 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" event={"ID":"e67f8caf-f4af-41a6-bb90-07687d1e2c70","Type":"ContainerDied","Data":"f6495158ca2d64a945245b37e4778f94952dfcfff7c760f99b9fa8a81a932439"} Jan 05 14:09:35 crc kubenswrapper[4740]: I0105 14:09:35.472013 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dc98bb458-s9tqv" event={"ID":"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96","Type":"ContainerStarted","Data":"984cac5d08b74cdcd28d730a9e12568f729deeb00d6bd947a5a3d390f013454a"} Jan 05 14:09:35 crc kubenswrapper[4740]: I0105 14:09:35.472043 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dc98bb458-s9tqv" event={"ID":"1f5ea36a-57ff-4d4e-ac3d-914d2278cd96","Type":"ContainerStarted","Data":"72a48add086851f0512c1e67c0df21301395bcaf2edb88a44530b11de9d8f83d"} Jan 05 14:09:35 crc kubenswrapper[4740]: I0105 14:09:35.473015 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:35 crc kubenswrapper[4740]: I0105 14:09:35.473048 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:35 crc kubenswrapper[4740]: I0105 14:09:35.474630 4740 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" event={"ID":"cf33de39-642c-4ea4-8b3b-e505d43016f8","Type":"ContainerStarted","Data":"ce478ddcede8b3f40ac08540280df66fd2b97b4a4ac81f064f25c50e91e7555b"} Jan 05 14:09:35 crc kubenswrapper[4740]: I0105 14:09:35.483037 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b5964cd58-f9xhr" event={"ID":"0001a65f-37dd-467e-b526-921e55a3152f","Type":"ContainerStarted","Data":"f73b3a56f73031c288ac64eb1d4c3460c582e44ec42cccfda475a0fbf27d985a"} Jan 05 14:09:35 crc kubenswrapper[4740]: I0105 14:09:35.483079 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b5964cd58-f9xhr" event={"ID":"0001a65f-37dd-467e-b526-921e55a3152f","Type":"ContainerStarted","Data":"d16946a65075586e3e42c5b8b265a995510f8354749d788470545a423971f640"} Jan 05 14:09:35 crc kubenswrapper[4740]: I0105 14:09:35.483776 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:35 crc kubenswrapper[4740]: I0105 14:09:35.483804 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:35 crc kubenswrapper[4740]: I0105 14:09:35.533054 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5b5964cd58-f9xhr" podStartSLOduration=8.533031246 podStartE2EDuration="8.533031246s" podCreationTimestamp="2026-01-05 14:09:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:09:35.516492793 +0000 UTC m=+1224.823401372" watchObservedRunningTime="2026-01-05 14:09:35.533031246 +0000 UTC m=+1224.839939825" Jan 05 14:09:35 crc kubenswrapper[4740]: I0105 14:09:35.551111 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5dc98bb458-s9tqv" podStartSLOduration=5.551095801 podStartE2EDuration="5.551095801s" podCreationTimestamp="2026-01-05 14:09:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:09:35.54585828 +0000 UTC m=+1224.852766859" watchObservedRunningTime="2026-01-05 14:09:35.551095801 +0000 UTC m=+1224.858004380" Jan 05 14:09:35 crc kubenswrapper[4740]: I0105 14:09:35.664302 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.422449 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5797c5bd9-frslw" Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.435633 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.471733 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-config\") pod \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.471856 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq722\" (UniqueName: \"kubernetes.io/projected/e67f8caf-f4af-41a6-bb90-07687d1e2c70-kube-api-access-wq722\") pod \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.471907 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-dns-swift-storage-0\") pod \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.472047 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-ovsdbserver-sb\") pod \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.472142 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-dns-svc\") pod \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.472190 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-ovsdbserver-nb\") pod \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\" (UID: \"e67f8caf-f4af-41a6-bb90-07687d1e2c70\") " Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.543753 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e67f8caf-f4af-41a6-bb90-07687d1e2c70-kube-api-access-wq722" (OuterVolumeSpecName: "kube-api-access-wq722") pod "e67f8caf-f4af-41a6-bb90-07687d1e2c70" (UID: "e67f8caf-f4af-41a6-bb90-07687d1e2c70"). InnerVolumeSpecName "kube-api-access-wq722". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.545586 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e67f8caf-f4af-41a6-bb90-07687d1e2c70" (UID: "e67f8caf-f4af-41a6-bb90-07687d1e2c70"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.545779 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e67f8caf-f4af-41a6-bb90-07687d1e2c70" (UID: "e67f8caf-f4af-41a6-bb90-07687d1e2c70"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.549524 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b8e0a645-f5dd-4e8f-a166-310bec2301cd","Type":"ContainerStarted","Data":"d23047f23e56252595ac5618a8d6a0bb78eaf058aca74dbb79a1be696e36ba5d"} Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.562700 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e67f8caf-f4af-41a6-bb90-07687d1e2c70" (UID: "e67f8caf-f4af-41a6-bb90-07687d1e2c70"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.580730 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq722\" (UniqueName: \"kubernetes.io/projected/e67f8caf-f4af-41a6-bb90-07687d1e2c70-kube-api-access-wq722\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.580762 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.580771 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.580782 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.621507 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" event={"ID":"e67f8caf-f4af-41a6-bb90-07687d1e2c70","Type":"ContainerDied","Data":"a61c882238f037f2b5e8aef6ce5659d04d8fdf15963a6f173e8d7eb314febdf2"} Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.621565 4740 scope.go:117] "RemoveContainer" containerID="f6495158ca2d64a945245b37e4778f94952dfcfff7c760f99b9fa8a81a932439" Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.621700 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-mskbh" Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.715698 4740 generic.go:334] "Generic (PLEG): container finished" podID="cf33de39-642c-4ea4-8b3b-e505d43016f8" containerID="8c90290b7aa487c361dec58ddaf0c839a114359312b9d1c83ec748212c1f3b0b" exitCode=0 Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.716784 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" event={"ID":"cf33de39-642c-4ea4-8b3b-e505d43016f8","Type":"ContainerDied","Data":"8c90290b7aa487c361dec58ddaf0c839a114359312b9d1c83ec748212c1f3b0b"} Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.751140 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-config" (OuterVolumeSpecName: "config") pod "e67f8caf-f4af-41a6-bb90-07687d1e2c70" (UID: "e67f8caf-f4af-41a6-bb90-07687d1e2c70"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.751242 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56585fbbf8-r9sc8"] Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.751535 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56585fbbf8-r9sc8" podUID="390a9292-95a7-457f-b300-e8613949150c" containerName="neutron-api" containerID="cri-o://ab8c29f10fda2f89b91cf9ecb7ef32c2511723b4806eedf1822931c6947be590" gracePeriod=30 Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.751745 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56585fbbf8-r9sc8" podUID="390a9292-95a7-457f-b300-e8613949150c" containerName="neutron-httpd" containerID="cri-o://137bf2671fc88d1ff2344095ede9e68e7527f972709822620a3f3ebba1b00364" gracePeriod=30 Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.796953 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e67f8caf-f4af-41a6-bb90-07687d1e2c70" (UID: "e67f8caf-f4af-41a6-bb90-07687d1e2c70"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.819717 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:36 crc kubenswrapper[4740]: I0105 14:09:36.819746 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e67f8caf-f4af-41a6-bb90-07687d1e2c70-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.003521 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-mskbh"] Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.010438 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-mskbh"] Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.731265 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" event={"ID":"98684691-8ef2-4cb5-85e2-fe5913a5b3c0","Type":"ContainerStarted","Data":"47d5755b105d825538690159e83ce3f456d5a02f3215047ee8e1063580740b4a"} Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.733652 4740 generic.go:334] "Generic (PLEG): container finished" podID="390a9292-95a7-457f-b300-e8613949150c" containerID="137bf2671fc88d1ff2344095ede9e68e7527f972709822620a3f3ebba1b00364" exitCode=0 Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.733684 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56585fbbf8-r9sc8" event={"ID":"390a9292-95a7-457f-b300-e8613949150c","Type":"ContainerDied","Data":"137bf2671fc88d1ff2344095ede9e68e7527f972709822620a3f3ebba1b00364"} Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.741095 4740 generic.go:334] "Generic (PLEG): container finished" podID="425802c2-2528-451d-960a-24372126b18c" containerID="e59bce26fca4df9150498bb6e10543a8675137393eff56d64d9ec6771bc0ceb4" exitCode=0 Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.741239 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"425802c2-2528-451d-960a-24372126b18c","Type":"ContainerDied","Data":"e59bce26fca4df9150498bb6e10543a8675137393eff56d64d9ec6771bc0ceb4"} Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.741299 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"425802c2-2528-451d-960a-24372126b18c","Type":"ContainerDied","Data":"529ef03720f0381c621357a611ff36271476c32c5676f8a354e7e2e39feefad7"} Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.741316 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="529ef03720f0381c621357a611ff36271476c32c5676f8a354e7e2e39feefad7" Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.746857 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7f99fddc9f-j57sh" event={"ID":"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0","Type":"ContainerStarted","Data":"2db7fe9750d911074130125aab0a1219b6cc3e3507dbc5c4f456d3b84741e528"} Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.746911 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7f99fddc9f-j57sh" event={"ID":"6e339bc8-cbcb-48ac-84bd-ad37fcd552c0","Type":"ContainerStarted","Data":"24ca4220f8d7e8775bf7c2f04efeb0976598b679f0bc751c099587e1cc37f93d"} Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.748543 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.749839 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" event={"ID":"cf33de39-642c-4ea4-8b3b-e505d43016f8","Type":"ContainerStarted","Data":"37113591b12f42e7504421b7b942eb220d3cb9518e7bc8bd8578e9b1b53a28ac"} Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.769277 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7f99fddc9f-j57sh" podStartSLOduration=7.980844432 podStartE2EDuration="10.769159404s" podCreationTimestamp="2026-01-05 14:09:27 +0000 UTC" firstStartedPulling="2026-01-05 14:09:33.812323657 +0000 UTC m=+1223.119232236" lastFinishedPulling="2026-01-05 14:09:36.600638639 +0000 UTC m=+1225.907547208" observedRunningTime="2026-01-05 14:09:37.767309825 +0000 UTC m=+1227.074218404" watchObservedRunningTime="2026-01-05 14:09:37.769159404 +0000 UTC m=+1227.076067983" Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.853733 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" podStartSLOduration=4.8537051 podStartE2EDuration="4.8537051s" podCreationTimestamp="2026-01-05 14:09:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:09:37.851036149 +0000 UTC m=+1227.157944738" watchObservedRunningTime="2026-01-05 14:09:37.8537051 +0000 UTC m=+1227.160613689" Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.954534 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425802c2-2528-451d-960a-24372126b18c-run-httpd\") pod \"425802c2-2528-451d-960a-24372126b18c\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.954909 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-combined-ca-bundle\") pod \"425802c2-2528-451d-960a-24372126b18c\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.954943 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-scripts\") pod \"425802c2-2528-451d-960a-24372126b18c\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.955019 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hgc2\" (UniqueName: \"kubernetes.io/projected/425802c2-2528-451d-960a-24372126b18c-kube-api-access-6hgc2\") pod \"425802c2-2528-451d-960a-24372126b18c\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.955055 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-config-data\") pod \"425802c2-2528-451d-960a-24372126b18c\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.955158 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425802c2-2528-451d-960a-24372126b18c-log-httpd\") pod \"425802c2-2528-451d-960a-24372126b18c\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.955221 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/425802c2-2528-451d-960a-24372126b18c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "425802c2-2528-451d-960a-24372126b18c" (UID: "425802c2-2528-451d-960a-24372126b18c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.955369 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-sg-core-conf-yaml\") pod \"425802c2-2528-451d-960a-24372126b18c\" (UID: \"425802c2-2528-451d-960a-24372126b18c\") " Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.956100 4740 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425802c2-2528-451d-960a-24372126b18c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.957865 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/425802c2-2528-451d-960a-24372126b18c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "425802c2-2528-451d-960a-24372126b18c" (UID: "425802c2-2528-451d-960a-24372126b18c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.963425 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/425802c2-2528-451d-960a-24372126b18c-kube-api-access-6hgc2" (OuterVolumeSpecName: "kube-api-access-6hgc2") pod "425802c2-2528-451d-960a-24372126b18c" (UID: "425802c2-2528-451d-960a-24372126b18c"). InnerVolumeSpecName "kube-api-access-6hgc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:09:37 crc kubenswrapper[4740]: I0105 14:09:37.964348 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-scripts" (OuterVolumeSpecName: "scripts") pod "425802c2-2528-451d-960a-24372126b18c" (UID: "425802c2-2528-451d-960a-24372126b18c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.024218 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "425802c2-2528-451d-960a-24372126b18c" (UID: "425802c2-2528-451d-960a-24372126b18c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.058962 4740 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/425802c2-2528-451d-960a-24372126b18c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.058991 4740 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.059001 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.059012 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hgc2\" (UniqueName: \"kubernetes.io/projected/425802c2-2528-451d-960a-24372126b18c-kube-api-access-6hgc2\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.070582 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "425802c2-2528-451d-960a-24372126b18c" (UID: "425802c2-2528-451d-960a-24372126b18c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.120705 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-config-data" (OuterVolumeSpecName: "config-data") pod "425802c2-2528-451d-960a-24372126b18c" (UID: "425802c2-2528-451d-960a-24372126b18c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.161601 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.161633 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425802c2-2528-451d-960a-24372126b18c-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.763950 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4d36e906-0206-4cda-913f-9cb3fdee2adc","Type":"ContainerStarted","Data":"f638b9bd2c9ad2fd0e633c117713c046a29eddee9bf24cf964af6cf00c08914c"} Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.764240 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4d36e906-0206-4cda-913f-9cb3fdee2adc","Type":"ContainerStarted","Data":"b2c9746299cb8f1a295d4c0511005b4b76fe93851f4eaa94ab0de0359c3db9da"} Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.768326 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b8e0a645-f5dd-4e8f-a166-310bec2301cd","Type":"ContainerStarted","Data":"e9cc1482edcbcb82406ec9648c0304d77917f947a6470b9aa5aadad1df652348"} Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.768360 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b8e0a645-f5dd-4e8f-a166-310bec2301cd","Type":"ContainerStarted","Data":"f9df0a1efe8e1fd5aa6bbc40c927e8d8827eea01e60c3d66bf2daaa80251bcdd"} Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.768442 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b8e0a645-f5dd-4e8f-a166-310bec2301cd" containerName="cinder-api-log" containerID="cri-o://f9df0a1efe8e1fd5aa6bbc40c927e8d8827eea01e60c3d66bf2daaa80251bcdd" gracePeriod=30 Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.768516 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.769736 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b8e0a645-f5dd-4e8f-a166-310bec2301cd" containerName="cinder-api" containerID="cri-o://e9cc1482edcbcb82406ec9648c0304d77917f947a6470b9aa5aadad1df652348" gracePeriod=30 Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.774602 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" event={"ID":"98684691-8ef2-4cb5-85e2-fe5913a5b3c0","Type":"ContainerStarted","Data":"e3d131cee84bf359e4bbc66d780fd7da4bb128e1b6a7a592e254a75cceb58883"} Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.775351 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.775431 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.795929 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.066725965 podStartE2EDuration="5.795908242s" podCreationTimestamp="2026-01-05 14:09:33 +0000 UTC" firstStartedPulling="2026-01-05 14:09:34.912088551 +0000 UTC m=+1224.218997130" lastFinishedPulling="2026-01-05 14:09:36.641270838 +0000 UTC m=+1225.948179407" observedRunningTime="2026-01-05 14:09:38.783279064 +0000 UTC m=+1228.090187643" watchObservedRunningTime="2026-01-05 14:09:38.795908242 +0000 UTC m=+1228.102816821" Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.820891 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.820869311 podStartE2EDuration="5.820869311s" podCreationTimestamp="2026-01-05 14:09:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:09:38.812348993 +0000 UTC m=+1228.119257572" watchObservedRunningTime="2026-01-05 14:09:38.820869311 +0000 UTC m=+1228.127777890" Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.829933 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7c988b84c4-q8wdn" podStartSLOduration=9.207928207 podStartE2EDuration="11.829914443s" podCreationTimestamp="2026-01-05 14:09:27 +0000 UTC" firstStartedPulling="2026-01-05 14:09:33.978319904 +0000 UTC m=+1223.285228483" lastFinishedPulling="2026-01-05 14:09:36.60030614 +0000 UTC m=+1225.907214719" observedRunningTime="2026-01-05 14:09:38.82794765 +0000 UTC m=+1228.134856239" watchObservedRunningTime="2026-01-05 14:09:38.829914443 +0000 UTC m=+1228.136823022" Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.990958 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e67f8caf-f4af-41a6-bb90-07687d1e2c70" path="/var/lib/kubelet/pods/e67f8caf-f4af-41a6-bb90-07687d1e2c70/volumes" Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.991850 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:09:38 crc kubenswrapper[4740]: I0105 14:09:38.994819 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.002685 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:09:39 crc kubenswrapper[4740]: E0105 14:09:39.003270 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425802c2-2528-451d-960a-24372126b18c" containerName="sg-core" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.003288 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="425802c2-2528-451d-960a-24372126b18c" containerName="sg-core" Jan 05 14:09:39 crc kubenswrapper[4740]: E0105 14:09:39.003309 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425802c2-2528-451d-960a-24372126b18c" containerName="ceilometer-notification-agent" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.003316 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="425802c2-2528-451d-960a-24372126b18c" containerName="ceilometer-notification-agent" Jan 05 14:09:39 crc kubenswrapper[4740]: E0105 14:09:39.003333 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425802c2-2528-451d-960a-24372126b18c" 
containerName="ceilometer-central-agent" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.003339 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="425802c2-2528-451d-960a-24372126b18c" containerName="ceilometer-central-agent" Jan 05 14:09:39 crc kubenswrapper[4740]: E0105 14:09:39.003364 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425802c2-2528-451d-960a-24372126b18c" containerName="proxy-httpd" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.003370 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="425802c2-2528-451d-960a-24372126b18c" containerName="proxy-httpd" Jan 05 14:09:39 crc kubenswrapper[4740]: E0105 14:09:39.003381 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e67f8caf-f4af-41a6-bb90-07687d1e2c70" containerName="init" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.003388 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e67f8caf-f4af-41a6-bb90-07687d1e2c70" containerName="init" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.003665 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="425802c2-2528-451d-960a-24372126b18c" containerName="proxy-httpd" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.003686 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="425802c2-2528-451d-960a-24372126b18c" containerName="ceilometer-notification-agent" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.003703 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="425802c2-2528-451d-960a-24372126b18c" containerName="ceilometer-central-agent" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.003722 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="425802c2-2528-451d-960a-24372126b18c" containerName="sg-core" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.003735 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="e67f8caf-f4af-41a6-bb90-07687d1e2c70" containerName="init" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.005949 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.010820 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.011230 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.011512 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.012891 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.103316 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-scripts\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.103590 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.103645 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-config-data\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.103872 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zhfj\" (UniqueName: \"kubernetes.io/projected/953b81f2-4ef4-4837-ae64-fed14572e2e6-kube-api-access-5zhfj\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.103978 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/953b81f2-4ef4-4837-ae64-fed14572e2e6-log-httpd\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.104014 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.104153 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/953b81f2-4ef4-4837-ae64-fed14572e2e6-run-httpd\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.206432 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/953b81f2-4ef4-4837-ae64-fed14572e2e6-run-httpd\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.206534 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-scripts\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.206657 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.206680 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-config-data\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.206739 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zhfj\" (UniqueName: \"kubernetes.io/projected/953b81f2-4ef4-4837-ae64-fed14572e2e6-kube-api-access-5zhfj\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.206773 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/953b81f2-4ef4-4837-ae64-fed14572e2e6-log-httpd\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.206795 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.207583 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/953b81f2-4ef4-4837-ae64-fed14572e2e6-run-httpd\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.208462 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/953b81f2-4ef4-4837-ae64-fed14572e2e6-log-httpd\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.213474 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-config-data\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.214828 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.215887 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-scripts\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.217803 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.227975 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zhfj\" (UniqueName: \"kubernetes.io/projected/953b81f2-4ef4-4837-ae64-fed14572e2e6-kube-api-access-5zhfj\") pod \"ceilometer-0\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.338381 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.795725 4740 generic.go:334] "Generic (PLEG): container finished" podID="b8e0a645-f5dd-4e8f-a166-310bec2301cd" containerID="f9df0a1efe8e1fd5aa6bbc40c927e8d8827eea01e60c3d66bf2daaa80251bcdd" exitCode=143 Jan 05 14:09:39 crc kubenswrapper[4740]: I0105 14:09:39.796815 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b8e0a645-f5dd-4e8f-a166-310bec2301cd","Type":"ContainerDied","Data":"f9df0a1efe8e1fd5aa6bbc40c927e8d8827eea01e60c3d66bf2daaa80251bcdd"} Jan 05 14:09:40 crc kubenswrapper[4740]: I0105 14:09:40.017949 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:09:40 crc kubenswrapper[4740]: I0105 14:09:40.825481 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"953b81f2-4ef4-4837-ae64-fed14572e2e6","Type":"ContainerStarted","Data":"0bce1d13410bfc8006fdec2aeba57752cbad00b9240c1ffdac87df24e1e32f1e"} Jan 05 14:09:40 crc kubenswrapper[4740]: I0105 14:09:40.985403 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="425802c2-2528-451d-960a-24372126b18c" path="/var/lib/kubelet/pods/425802c2-2528-451d-960a-24372126b18c/volumes" Jan 05 14:09:42 crc kubenswrapper[4740]: I0105 14:09:42.846863 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"953b81f2-4ef4-4837-ae64-fed14572e2e6","Type":"ContainerStarted","Data":"82e8e6b99c9f49bdc55592fe4b03219dc6cc593a93cc72d1ab2c5aa24c9cd0c6"} Jan 05 14:09:42 crc kubenswrapper[4740]: I0105 14:09:42.888576 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:42 crc kubenswrapper[4740]: I0105 14:09:42.909347 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5dc98bb458-s9tqv" podUID="1f5ea36a-57ff-4d4e-ac3d-914d2278cd96" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 05 14:09:43 crc kubenswrapper[4740]: I0105 14:09:43.856909 4740 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"953b81f2-4ef4-4837-ae64-fed14572e2e6","Type":"ContainerStarted","Data":"3b029f5ff7dd3df43796804a16b4a895b0070043a317dddcfa20e34cd7d883e2"} Jan 05 14:09:44 crc kubenswrapper[4740]: I0105 14:09:44.317878 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 05 14:09:44 crc kubenswrapper[4740]: I0105 14:09:44.383091 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 05 14:09:44 crc kubenswrapper[4740]: I0105 14:09:44.437561 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:09:44 crc kubenswrapper[4740]: I0105 14:09:44.516913 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-t7gd6"] Jan 05 14:09:44 crc kubenswrapper[4740]: I0105 14:09:44.517150 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" podUID="93830ee5-2e0a-4f1e-9234-fa45767e0391" containerName="dnsmasq-dns" containerID="cri-o://d9c76e366cf9c1556f910fbbc9ec510551a00adbb450d64e9c6f7af3c0e9d11b" gracePeriod=10 Jan 05 14:09:44 crc kubenswrapper[4740]: I0105 14:09:44.891745 4740 generic.go:334] "Generic (PLEG): container finished" podID="93830ee5-2e0a-4f1e-9234-fa45767e0391" containerID="d9c76e366cf9c1556f910fbbc9ec510551a00adbb450d64e9c6f7af3c0e9d11b" exitCode=0 Jan 05 14:09:44 crc kubenswrapper[4740]: I0105 14:09:44.891836 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" event={"ID":"93830ee5-2e0a-4f1e-9234-fa45767e0391","Type":"ContainerDied","Data":"d9c76e366cf9c1556f910fbbc9ec510551a00adbb450d64e9c6f7af3c0e9d11b"} Jan 05 14:09:44 crc kubenswrapper[4740]: I0105 14:09:44.909620 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4d36e906-0206-4cda-913f-9cb3fdee2adc" containerName="cinder-scheduler" containerID="cri-o://b2c9746299cb8f1a295d4c0511005b4b76fe93851f4eaa94ab0de0359c3db9da" gracePeriod=30 Jan 05 14:09:44 crc kubenswrapper[4740]: I0105 14:09:44.909757 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"953b81f2-4ef4-4837-ae64-fed14572e2e6","Type":"ContainerStarted","Data":"237056f5c7edf54c862457bc646fed0e3f26ddd280d0a004c8c8ae6712b692df"} Jan 05 14:09:44 crc kubenswrapper[4740]: I0105 14:09:44.910164 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4d36e906-0206-4cda-913f-9cb3fdee2adc" containerName="probe" containerID="cri-o://f638b9bd2c9ad2fd0e633c117713c046a29eddee9bf24cf964af6cf00c08914c" gracePeriod=30 Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.072648 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.072901 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5dc98bb458-s9tqv" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.225830 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5b5964cd58-f9xhr"] Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.408379 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:45 crc kubenswrapper[4740]: E0105 14:09:45.453550 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod390a9292_95a7_457f_b300_e8613949150c.slice/crio-conmon-ab8c29f10fda2f89b91cf9ecb7ef32c2511723b4806eedf1822931c6947be590.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod390a9292_95a7_457f_b300_e8613949150c.slice/crio-ab8c29f10fda2f89b91cf9ecb7ef32c2511723b4806eedf1822931c6947be590.scope\": RecentStats: unable to find data in memory cache]" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.469686 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.584176 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-ovsdbserver-sb\") pod \"93830ee5-2e0a-4f1e-9234-fa45767e0391\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.584249 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-config\") pod \"93830ee5-2e0a-4f1e-9234-fa45767e0391\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.584340 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-ovsdbserver-nb\") pod \"93830ee5-2e0a-4f1e-9234-fa45767e0391\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.584396 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhkww\" (UniqueName: \"kubernetes.io/projected/93830ee5-2e0a-4f1e-9234-fa45767e0391-kube-api-access-dhkww\") pod \"93830ee5-2e0a-4f1e-9234-fa45767e0391\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.584456 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-dns-swift-storage-0\") pod \"93830ee5-2e0a-4f1e-9234-fa45767e0391\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.584483 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-dns-svc\") pod \"93830ee5-2e0a-4f1e-9234-fa45767e0391\" (UID: \"93830ee5-2e0a-4f1e-9234-fa45767e0391\") " Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.612004 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93830ee5-2e0a-4f1e-9234-fa45767e0391-kube-api-access-dhkww" (OuterVolumeSpecName: "kube-api-access-dhkww") pod "93830ee5-2e0a-4f1e-9234-fa45767e0391" (UID: "93830ee5-2e0a-4f1e-9234-fa45767e0391"). InnerVolumeSpecName "kube-api-access-dhkww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.690747 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "93830ee5-2e0a-4f1e-9234-fa45767e0391" (UID: "93830ee5-2e0a-4f1e-9234-fa45767e0391"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.691789 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhkww\" (UniqueName: \"kubernetes.io/projected/93830ee5-2e0a-4f1e-9234-fa45767e0391-kube-api-access-dhkww\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.705228 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "93830ee5-2e0a-4f1e-9234-fa45767e0391" (UID: "93830ee5-2e0a-4f1e-9234-fa45767e0391"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.706821 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "93830ee5-2e0a-4f1e-9234-fa45767e0391" (UID: "93830ee5-2e0a-4f1e-9234-fa45767e0391"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.772382 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-config" (OuterVolumeSpecName: "config") pod "93830ee5-2e0a-4f1e-9234-fa45767e0391" (UID: "93830ee5-2e0a-4f1e-9234-fa45767e0391"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.772960 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.794872 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.794905 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.794914 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.794925 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.837412 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "93830ee5-2e0a-4f1e-9234-fa45767e0391" (UID: "93830ee5-2e0a-4f1e-9234-fa45767e0391"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.895580 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4smpw\" (UniqueName: \"kubernetes.io/projected/390a9292-95a7-457f-b300-e8613949150c-kube-api-access-4smpw\") pod \"390a9292-95a7-457f-b300-e8613949150c\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.895631 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-ovndb-tls-certs\") pod \"390a9292-95a7-457f-b300-e8613949150c\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.895861 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-combined-ca-bundle\") pod \"390a9292-95a7-457f-b300-e8613949150c\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.895957 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-httpd-config\") pod \"390a9292-95a7-457f-b300-e8613949150c\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.896042 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-config\") pod \"390a9292-95a7-457f-b300-e8613949150c\" (UID: \"390a9292-95a7-457f-b300-e8613949150c\") " Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.896834 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/93830ee5-2e0a-4f1e-9234-fa45767e0391-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.899204 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/390a9292-95a7-457f-b300-e8613949150c-kube-api-access-4smpw" (OuterVolumeSpecName: "kube-api-access-4smpw") pod "390a9292-95a7-457f-b300-e8613949150c" (UID: "390a9292-95a7-457f-b300-e8613949150c"). InnerVolumeSpecName "kube-api-access-4smpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.899762 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "390a9292-95a7-457f-b300-e8613949150c" (UID: "390a9292-95a7-457f-b300-e8613949150c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.924425 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"953b81f2-4ef4-4837-ae64-fed14572e2e6","Type":"ContainerStarted","Data":"39814a101297bf5534d31b9ad43153757f28a60b5ac6a6f04faf202643b2b7ec"} Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.925009 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.927089 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" event={"ID":"93830ee5-2e0a-4f1e-9234-fa45767e0391","Type":"ContainerDied","Data":"7818b658e70eb1602346b40e33e36f91fca67a1aa62518aa0f524280a0430bd2"} Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.927140 4740 scope.go:117] "RemoveContainer" containerID="d9c76e366cf9c1556f910fbbc9ec510551a00adbb450d64e9c6f7af3c0e9d11b" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.927328 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-t7gd6" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.931557 4740 generic.go:334] "Generic (PLEG): container finished" podID="390a9292-95a7-457f-b300-e8613949150c" containerID="ab8c29f10fda2f89b91cf9ecb7ef32c2511723b4806eedf1822931c6947be590" exitCode=0 Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.931622 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56585fbbf8-r9sc8" event={"ID":"390a9292-95a7-457f-b300-e8613949150c","Type":"ContainerDied","Data":"ab8c29f10fda2f89b91cf9ecb7ef32c2511723b4806eedf1822931c6947be590"} Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.931649 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56585fbbf8-r9sc8" event={"ID":"390a9292-95a7-457f-b300-e8613949150c","Type":"ContainerDied","Data":"07c5af9492de0374a0a283b8b29fbb1518c64d8ffcf62d7bc3f0a23e29a7a199"} Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.931708 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56585fbbf8-r9sc8" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.935489 4740 generic.go:334] "Generic (PLEG): container finished" podID="4d36e906-0206-4cda-913f-9cb3fdee2adc" containerID="f638b9bd2c9ad2fd0e633c117713c046a29eddee9bf24cf964af6cf00c08914c" exitCode=0 Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.935671 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5b5964cd58-f9xhr" podUID="0001a65f-37dd-467e-b526-921e55a3152f" containerName="barbican-api-log" containerID="cri-o://d16946a65075586e3e42c5b8b265a995510f8354749d788470545a423971f640" gracePeriod=30 Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.935903 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4d36e906-0206-4cda-913f-9cb3fdee2adc","Type":"ContainerDied","Data":"f638b9bd2c9ad2fd0e633c117713c046a29eddee9bf24cf964af6cf00c08914c"} Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.935959 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5b5964cd58-f9xhr" podUID="0001a65f-37dd-467e-b526-921e55a3152f" containerName="barbican-api" containerID="cri-o://f73b3a56f73031c288ac64eb1d4c3460c582e44ec42cccfda475a0fbf27d985a" gracePeriod=30 Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.941214 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5b5964cd58-f9xhr" podUID="0001a65f-37dd-467e-b526-921e55a3152f" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.208:9311/healthcheck\": EOF" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.979341 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.528112208 podStartE2EDuration="7.979321421s" podCreationTimestamp="2026-01-05 14:09:38 +0000 UTC" firstStartedPulling="2026-01-05 14:09:40.030257182 +0000 UTC m=+1229.337165761" lastFinishedPulling="2026-01-05 14:09:45.481466395 +0000 UTC m=+1234.788374974" observedRunningTime="2026-01-05 14:09:45.952724738 +0000 UTC m=+1235.259633317" watchObservedRunningTime="2026-01-05 14:09:45.979321421 +0000 UTC m=+1235.286230000" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.987379 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "390a9292-95a7-457f-b300-e8613949150c" (UID: "390a9292-95a7-457f-b300-e8613949150c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.996215 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-config" (OuterVolumeSpecName: "config") pod "390a9292-95a7-457f-b300-e8613949150c" (UID: "390a9292-95a7-457f-b300-e8613949150c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.999741 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4smpw\" (UniqueName: \"kubernetes.io/projected/390a9292-95a7-457f-b300-e8613949150c-kube-api-access-4smpw\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.999778 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.999788 4740 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:45 crc kubenswrapper[4740]: I0105 14:09:45.999798 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.002584 4740 scope.go:117] "RemoveContainer" containerID="01962e81caa687ad962cda09a9a7505a3758222384270ac679b42c7797cb961d" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.006341 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-t7gd6"] Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.018646 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-t7gd6"] Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.023732 4740 scope.go:117] "RemoveContainer" containerID="137bf2671fc88d1ff2344095ede9e68e7527f972709822620a3f3ebba1b00364" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.023754 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "390a9292-95a7-457f-b300-e8613949150c" (UID: "390a9292-95a7-457f-b300-e8613949150c"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.089083 4740 scope.go:117] "RemoveContainer" containerID="ab8c29f10fda2f89b91cf9ecb7ef32c2511723b4806eedf1822931c6947be590" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.105095 4740 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/390a9292-95a7-457f-b300-e8613949150c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.123497 4740 scope.go:117] "RemoveContainer" containerID="137bf2671fc88d1ff2344095ede9e68e7527f972709822620a3f3ebba1b00364" Jan 05 14:09:46 crc kubenswrapper[4740]: E0105 14:09:46.127255 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"137bf2671fc88d1ff2344095ede9e68e7527f972709822620a3f3ebba1b00364\": container with ID starting with 137bf2671fc88d1ff2344095ede9e68e7527f972709822620a3f3ebba1b00364 not found: ID does not exist" containerID="137bf2671fc88d1ff2344095ede9e68e7527f972709822620a3f3ebba1b00364" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.127298 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137bf2671fc88d1ff2344095ede9e68e7527f972709822620a3f3ebba1b00364"} err="failed to get container status \"137bf2671fc88d1ff2344095ede9e68e7527f972709822620a3f3ebba1b00364\": rpc error: code = NotFound desc = could not find container \"137bf2671fc88d1ff2344095ede9e68e7527f972709822620a3f3ebba1b00364\": container with ID starting with 137bf2671fc88d1ff2344095ede9e68e7527f972709822620a3f3ebba1b00364 not found: ID does not exist" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.127329 4740 scope.go:117] "RemoveContainer" containerID="ab8c29f10fda2f89b91cf9ecb7ef32c2511723b4806eedf1822931c6947be590" Jan 05 14:09:46 crc kubenswrapper[4740]: E0105 14:09:46.132131 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab8c29f10fda2f89b91cf9ecb7ef32c2511723b4806eedf1822931c6947be590\": container with ID starting with ab8c29f10fda2f89b91cf9ecb7ef32c2511723b4806eedf1822931c6947be590 not found: ID does not exist" containerID="ab8c29f10fda2f89b91cf9ecb7ef32c2511723b4806eedf1822931c6947be590" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.132155 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab8c29f10fda2f89b91cf9ecb7ef32c2511723b4806eedf1822931c6947be590"} err="failed to get container status \"ab8c29f10fda2f89b91cf9ecb7ef32c2511723b4806eedf1822931c6947be590\": rpc error: code = NotFound desc = could not find container \"ab8c29f10fda2f89b91cf9ecb7ef32c2511723b4806eedf1822931c6947be590\": container with ID starting with ab8c29f10fda2f89b91cf9ecb7ef32c2511723b4806eedf1822931c6947be590 not found: ID does not exist" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.291187 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56585fbbf8-r9sc8"] Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.311187 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-56585fbbf8-r9sc8"] Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.479330 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.616389 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-combined-ca-bundle\") pod \"4d36e906-0206-4cda-913f-9cb3fdee2adc\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.616761 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njvhg\" (UniqueName: \"kubernetes.io/projected/4d36e906-0206-4cda-913f-9cb3fdee2adc-kube-api-access-njvhg\") pod \"4d36e906-0206-4cda-913f-9cb3fdee2adc\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.616838 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-scripts\") pod \"4d36e906-0206-4cda-913f-9cb3fdee2adc\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.616940 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-config-data\") pod \"4d36e906-0206-4cda-913f-9cb3fdee2adc\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.616959 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-config-data-custom\") pod \"4d36e906-0206-4cda-913f-9cb3fdee2adc\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.617514 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d36e906-0206-4cda-913f-9cb3fdee2adc-etc-machine-id\") pod \"4d36e906-0206-4cda-913f-9cb3fdee2adc\" (UID: \"4d36e906-0206-4cda-913f-9cb3fdee2adc\") " Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.618266 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4d36e906-0206-4cda-913f-9cb3fdee2adc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4d36e906-0206-4cda-913f-9cb3fdee2adc" (UID: "4d36e906-0206-4cda-913f-9cb3fdee2adc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.626938 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-scripts" (OuterVolumeSpecName: "scripts") pod "4d36e906-0206-4cda-913f-9cb3fdee2adc" (UID: "4d36e906-0206-4cda-913f-9cb3fdee2adc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.628212 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d36e906-0206-4cda-913f-9cb3fdee2adc-kube-api-access-njvhg" (OuterVolumeSpecName: "kube-api-access-njvhg") pod "4d36e906-0206-4cda-913f-9cb3fdee2adc" (UID: "4d36e906-0206-4cda-913f-9cb3fdee2adc"). InnerVolumeSpecName "kube-api-access-njvhg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.644344 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4d36e906-0206-4cda-913f-9cb3fdee2adc" (UID: "4d36e906-0206-4cda-913f-9cb3fdee2adc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.702310 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d36e906-0206-4cda-913f-9cb3fdee2adc" (UID: "4d36e906-0206-4cda-913f-9cb3fdee2adc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.721786 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.721815 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njvhg\" (UniqueName: \"kubernetes.io/projected/4d36e906-0206-4cda-913f-9cb3fdee2adc-kube-api-access-njvhg\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.721828 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.721836 4740 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.721848 4740 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d36e906-0206-4cda-913f-9cb3fdee2adc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.759388 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-config-data" (OuterVolumeSpecName: "config-data") pod "4d36e906-0206-4cda-913f-9cb3fdee2adc" (UID: "4d36e906-0206-4cda-913f-9cb3fdee2adc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.823598 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d36e906-0206-4cda-913f-9cb3fdee2adc-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.953970 4740 generic.go:334] "Generic (PLEG): container finished" podID="0001a65f-37dd-467e-b526-921e55a3152f" containerID="d16946a65075586e3e42c5b8b265a995510f8354749d788470545a423971f640" exitCode=143 Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.954035 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b5964cd58-f9xhr" event={"ID":"0001a65f-37dd-467e-b526-921e55a3152f","Type":"ContainerDied","Data":"d16946a65075586e3e42c5b8b265a995510f8354749d788470545a423971f640"} Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.956294 4740 generic.go:334] "Generic (PLEG): container finished" podID="4d36e906-0206-4cda-913f-9cb3fdee2adc" containerID="b2c9746299cb8f1a295d4c0511005b4b76fe93851f4eaa94ab0de0359c3db9da" exitCode=0 Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.956333 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4d36e906-0206-4cda-913f-9cb3fdee2adc","Type":"ContainerDied","Data":"b2c9746299cb8f1a295d4c0511005b4b76fe93851f4eaa94ab0de0359c3db9da"} Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.956352 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4d36e906-0206-4cda-913f-9cb3fdee2adc","Type":"ContainerDied","Data":"4c7e4dba77658e0357413efc8a7be5910be9a38986d938df32f526a58b01fd09"} Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.956368 4740 scope.go:117] "RemoveContainer" containerID="f638b9bd2c9ad2fd0e633c117713c046a29eddee9bf24cf964af6cf00c08914c" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.956455 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.984820 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="390a9292-95a7-457f-b300-e8613949150c" path="/var/lib/kubelet/pods/390a9292-95a7-457f-b300-e8613949150c/volumes" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.985760 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93830ee5-2e0a-4f1e-9234-fa45767e0391" path="/var/lib/kubelet/pods/93830ee5-2e0a-4f1e-9234-fa45767e0391/volumes" Jan 05 14:09:46 crc kubenswrapper[4740]: I0105 14:09:46.994700 4740 scope.go:117] "RemoveContainer" containerID="b2c9746299cb8f1a295d4c0511005b4b76fe93851f4eaa94ab0de0359c3db9da" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.028303 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.058386 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.062479 4740 scope.go:117] "RemoveContainer" containerID="f638b9bd2c9ad2fd0e633c117713c046a29eddee9bf24cf964af6cf00c08914c" Jan 05 14:09:47 crc kubenswrapper[4740]: E0105 14:09:47.062905 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f638b9bd2c9ad2fd0e633c117713c046a29eddee9bf24cf964af6cf00c08914c\": container with ID starting with f638b9bd2c9ad2fd0e633c117713c046a29eddee9bf24cf964af6cf00c08914c not found: ID does not exist" containerID="f638b9bd2c9ad2fd0e633c117713c046a29eddee9bf24cf964af6cf00c08914c" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.062944 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f638b9bd2c9ad2fd0e633c117713c046a29eddee9bf24cf964af6cf00c08914c"} err="failed to get container status \"f638b9bd2c9ad2fd0e633c117713c046a29eddee9bf24cf964af6cf00c08914c\": rpc error: code = NotFound desc = could not find container \"f638b9bd2c9ad2fd0e633c117713c046a29eddee9bf24cf964af6cf00c08914c\": container with ID starting with f638b9bd2c9ad2fd0e633c117713c046a29eddee9bf24cf964af6cf00c08914c not found: ID does not exist" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.062969 4740 scope.go:117] "RemoveContainer" containerID="b2c9746299cb8f1a295d4c0511005b4b76fe93851f4eaa94ab0de0359c3db9da" Jan 05 14:09:47 crc kubenswrapper[4740]: E0105 14:09:47.063630 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2c9746299cb8f1a295d4c0511005b4b76fe93851f4eaa94ab0de0359c3db9da\": container with ID starting with b2c9746299cb8f1a295d4c0511005b4b76fe93851f4eaa94ab0de0359c3db9da not found: ID does not exist" containerID="b2c9746299cb8f1a295d4c0511005b4b76fe93851f4eaa94ab0de0359c3db9da" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.063668 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2c9746299cb8f1a295d4c0511005b4b76fe93851f4eaa94ab0de0359c3db9da"} err="failed to get container status \"b2c9746299cb8f1a295d4c0511005b4b76fe93851f4eaa94ab0de0359c3db9da\": rpc error: code = NotFound desc = could not find container \"b2c9746299cb8f1a295d4c0511005b4b76fe93851f4eaa94ab0de0359c3db9da\": container with ID starting with b2c9746299cb8f1a295d4c0511005b4b76fe93851f4eaa94ab0de0359c3db9da not found: ID does not exist" Jan 05 14:09:47 crc 
kubenswrapper[4740]: I0105 14:09:47.080185 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 05 14:09:47 crc kubenswrapper[4740]: E0105 14:09:47.080924 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390a9292-95a7-457f-b300-e8613949150c" containerName="neutron-api" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.080943 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="390a9292-95a7-457f-b300-e8613949150c" containerName="neutron-api" Jan 05 14:09:47 crc kubenswrapper[4740]: E0105 14:09:47.080955 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d36e906-0206-4cda-913f-9cb3fdee2adc" containerName="cinder-scheduler" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.080962 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d36e906-0206-4cda-913f-9cb3fdee2adc" containerName="cinder-scheduler" Jan 05 14:09:47 crc kubenswrapper[4740]: E0105 14:09:47.080981 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93830ee5-2e0a-4f1e-9234-fa45767e0391" containerName="init" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.080988 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="93830ee5-2e0a-4f1e-9234-fa45767e0391" containerName="init" Jan 05 14:09:47 crc kubenswrapper[4740]: E0105 14:09:47.081001 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d36e906-0206-4cda-913f-9cb3fdee2adc" containerName="probe" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.081006 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d36e906-0206-4cda-913f-9cb3fdee2adc" containerName="probe" Jan 05 14:09:47 crc kubenswrapper[4740]: E0105 14:09:47.081032 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93830ee5-2e0a-4f1e-9234-fa45767e0391" containerName="dnsmasq-dns" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.081038 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="93830ee5-2e0a-4f1e-9234-fa45767e0391" containerName="dnsmasq-dns" Jan 05 14:09:47 crc kubenswrapper[4740]: E0105 14:09:47.081058 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390a9292-95a7-457f-b300-e8613949150c" containerName="neutron-httpd" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.081080 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="390a9292-95a7-457f-b300-e8613949150c" containerName="neutron-httpd" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.081285 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="93830ee5-2e0a-4f1e-9234-fa45767e0391" containerName="dnsmasq-dns" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.081297 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="390a9292-95a7-457f-b300-e8613949150c" containerName="neutron-httpd" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.081310 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="390a9292-95a7-457f-b300-e8613949150c" containerName="neutron-api" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.081324 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d36e906-0206-4cda-913f-9cb3fdee2adc" containerName="cinder-scheduler" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.081337 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d36e906-0206-4cda-913f-9cb3fdee2adc" containerName="probe" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.083097 4740 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.084825 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.091628 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.247108 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/623b4799-3a7c-4cba-8632-420ef0704992-scripts\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.247152 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/623b4799-3a7c-4cba-8632-420ef0704992-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.247204 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/623b4799-3a7c-4cba-8632-420ef0704992-config-data\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.247221 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/623b4799-3a7c-4cba-8632-420ef0704992-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.247430 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jq2z\" (UniqueName: \"kubernetes.io/projected/623b4799-3a7c-4cba-8632-420ef0704992-kube-api-access-4jq2z\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.247487 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/623b4799-3a7c-4cba-8632-420ef0704992-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.349777 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jq2z\" (UniqueName: \"kubernetes.io/projected/623b4799-3a7c-4cba-8632-420ef0704992-kube-api-access-4jq2z\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.350010 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/623b4799-3a7c-4cba-8632-420ef0704992-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.350274 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/623b4799-3a7c-4cba-8632-420ef0704992-scripts\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.350350 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/623b4799-3a7c-4cba-8632-420ef0704992-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.350507 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/623b4799-3a7c-4cba-8632-420ef0704992-config-data\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.350582 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/623b4799-3a7c-4cba-8632-420ef0704992-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.350837 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/623b4799-3a7c-4cba-8632-420ef0704992-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.354167 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/623b4799-3a7c-4cba-8632-420ef0704992-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.356388 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/623b4799-3a7c-4cba-8632-420ef0704992-config-data\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.356876 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/623b4799-3a7c-4cba-8632-420ef0704992-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.361300 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/623b4799-3a7c-4cba-8632-420ef0704992-scripts\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.367939 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jq2z\" (UniqueName: \"kubernetes.io/projected/623b4799-3a7c-4cba-8632-420ef0704992-kube-api-access-4jq2z\") pod \"cinder-scheduler-0\" (UID: \"623b4799-3a7c-4cba-8632-420ef0704992\") " pod="openstack/cinder-scheduler-0" Jan 
05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.404859 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.498899 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 05 14:09:47 crc kubenswrapper[4740]: W0105 14:09:47.943944 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod623b4799_3a7c_4cba_8632_420ef0704992.slice/crio-a344dce744c70253af6b3ac9186c4b357bc15ac18f8ac4b0e1cc948870de3878 WatchSource:0}: Error finding container a344dce744c70253af6b3ac9186c4b357bc15ac18f8ac4b0e1cc948870de3878: Status 404 returned error can't find the container with id a344dce744c70253af6b3ac9186c4b357bc15ac18f8ac4b0e1cc948870de3878 Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.949175 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 05 14:09:47 crc kubenswrapper[4740]: I0105 14:09:47.982290 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"623b4799-3a7c-4cba-8632-420ef0704992","Type":"ContainerStarted","Data":"a344dce744c70253af6b3ac9186c4b357bc15ac18f8ac4b0e1cc948870de3878"} Jan 05 14:09:48 crc kubenswrapper[4740]: I0105 14:09:48.984696 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d36e906-0206-4cda-913f-9cb3fdee2adc" path="/var/lib/kubelet/pods/4d36e906-0206-4cda-913f-9cb3fdee2adc/volumes" Jan 05 14:09:49 crc kubenswrapper[4740]: I0105 14:09:49.003156 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"623b4799-3a7c-4cba-8632-420ef0704992","Type":"ContainerStarted","Data":"5723b4f51c66bf05ad0b480f2a668450cf5919bea8d5559edf38b075fbf0d323"} Jan 05 14:09:50 crc kubenswrapper[4740]: I0105 14:09:50.015769 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"623b4799-3a7c-4cba-8632-420ef0704992","Type":"ContainerStarted","Data":"ad2b76c277caafea9b087977f77b317a2055793b6b81587b4dc5e7ed5ae4ce6c"} Jan 05 14:09:51 crc kubenswrapper[4740]: I0105 14:09:51.364375 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5b5964cd58-f9xhr" podUID="0001a65f-37dd-467e-b526-921e55a3152f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.208:9311/healthcheck\": read tcp 10.217.0.2:50076->10.217.0.208:9311: read: connection reset by peer" Jan 05 14:09:51 crc kubenswrapper[4740]: I0105 14:09:51.365557 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5b5964cd58-f9xhr" podUID="0001a65f-37dd-467e-b526-921e55a3152f" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.208:9311/healthcheck\": read tcp 10.217.0.2:50080->10.217.0.208:9311: read: connection reset by peer" Jan 05 14:09:51 crc kubenswrapper[4740]: I0105 14:09:51.999825 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.028809 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.028788689 podStartE2EDuration="5.028788689s" podCreationTimestamp="2026-01-05 14:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:09:50.034513696 +0000 UTC m=+1239.341422295" watchObservedRunningTime="2026-01-05 14:09:52.028788689 +0000 UTC m=+1241.335697258" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.038444 4740 generic.go:334] "Generic (PLEG): container finished" podID="0001a65f-37dd-467e-b526-921e55a3152f" containerID="f73b3a56f73031c288ac64eb1d4c3460c582e44ec42cccfda475a0fbf27d985a" exitCode=0 Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.038683 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b5964cd58-f9xhr" event={"ID":"0001a65f-37dd-467e-b526-921e55a3152f","Type":"ContainerDied","Data":"f73b3a56f73031c288ac64eb1d4c3460c582e44ec42cccfda475a0fbf27d985a"} Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.038763 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b5964cd58-f9xhr" event={"ID":"0001a65f-37dd-467e-b526-921e55a3152f","Type":"ContainerDied","Data":"fdb36602ea47579e41bc0279dd21984cc7938765440cf500fcc0919216afdc20"} Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.038853 4740 scope.go:117] "RemoveContainer" containerID="f73b3a56f73031c288ac64eb1d4c3460c582e44ec42cccfda475a0fbf27d985a" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.039055 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5b5964cd58-f9xhr" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.067804 4740 scope.go:117] "RemoveContainer" containerID="d16946a65075586e3e42c5b8b265a995510f8354749d788470545a423971f640" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.092506 4740 scope.go:117] "RemoveContainer" containerID="f73b3a56f73031c288ac64eb1d4c3460c582e44ec42cccfda475a0fbf27d985a" Jan 05 14:09:52 crc kubenswrapper[4740]: E0105 14:09:52.092925 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f73b3a56f73031c288ac64eb1d4c3460c582e44ec42cccfda475a0fbf27d985a\": container with ID starting with f73b3a56f73031c288ac64eb1d4c3460c582e44ec42cccfda475a0fbf27d985a not found: ID does not exist" containerID="f73b3a56f73031c288ac64eb1d4c3460c582e44ec42cccfda475a0fbf27d985a" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.092963 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f73b3a56f73031c288ac64eb1d4c3460c582e44ec42cccfda475a0fbf27d985a"} err="failed to get container status \"f73b3a56f73031c288ac64eb1d4c3460c582e44ec42cccfda475a0fbf27d985a\": rpc error: code = NotFound desc = could not find container \"f73b3a56f73031c288ac64eb1d4c3460c582e44ec42cccfda475a0fbf27d985a\": container with ID starting with f73b3a56f73031c288ac64eb1d4c3460c582e44ec42cccfda475a0fbf27d985a not found: ID does not exist" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.092985 4740 scope.go:117] "RemoveContainer" containerID="d16946a65075586e3e42c5b8b265a995510f8354749d788470545a423971f640" Jan 05 14:09:52 crc kubenswrapper[4740]: E0105 14:09:52.093316 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d16946a65075586e3e42c5b8b265a995510f8354749d788470545a423971f640\": container with ID starting with d16946a65075586e3e42c5b8b265a995510f8354749d788470545a423971f640 not found: ID does not exist" containerID="d16946a65075586e3e42c5b8b265a995510f8354749d788470545a423971f640" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.093341 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d16946a65075586e3e42c5b8b265a995510f8354749d788470545a423971f640"} err="failed to get container status \"d16946a65075586e3e42c5b8b265a995510f8354749d788470545a423971f640\": rpc error: code = NotFound desc = could not find container \"d16946a65075586e3e42c5b8b265a995510f8354749d788470545a423971f640\": container with ID starting with d16946a65075586e3e42c5b8b265a995510f8354749d788470545a423971f640 not found: ID does not exist" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.176449 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5s7q\" (UniqueName: \"kubernetes.io/projected/0001a65f-37dd-467e-b526-921e55a3152f-kube-api-access-p5s7q\") pod \"0001a65f-37dd-467e-b526-921e55a3152f\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.176518 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-config-data\") pod \"0001a65f-37dd-467e-b526-921e55a3152f\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.176538 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-combined-ca-bundle\") pod \"0001a65f-37dd-467e-b526-921e55a3152f\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.176566 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0001a65f-37dd-467e-b526-921e55a3152f-logs\") pod \"0001a65f-37dd-467e-b526-921e55a3152f\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.176632 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-config-data-custom\") pod \"0001a65f-37dd-467e-b526-921e55a3152f\" (UID: \"0001a65f-37dd-467e-b526-921e55a3152f\") " Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.177587 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0001a65f-37dd-467e-b526-921e55a3152f-logs" (OuterVolumeSpecName: "logs") pod "0001a65f-37dd-467e-b526-921e55a3152f" (UID: "0001a65f-37dd-467e-b526-921e55a3152f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.183950 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0001a65f-37dd-467e-b526-921e55a3152f-kube-api-access-p5s7q" (OuterVolumeSpecName: "kube-api-access-p5s7q") pod "0001a65f-37dd-467e-b526-921e55a3152f" (UID: "0001a65f-37dd-467e-b526-921e55a3152f"). InnerVolumeSpecName "kube-api-access-p5s7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.190645 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0001a65f-37dd-467e-b526-921e55a3152f" (UID: "0001a65f-37dd-467e-b526-921e55a3152f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.214852 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0001a65f-37dd-467e-b526-921e55a3152f" (UID: "0001a65f-37dd-467e-b526-921e55a3152f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.263022 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-config-data" (OuterVolumeSpecName: "config-data") pod "0001a65f-37dd-467e-b526-921e55a3152f" (UID: "0001a65f-37dd-467e-b526-921e55a3152f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.279648 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5s7q\" (UniqueName: \"kubernetes.io/projected/0001a65f-37dd-467e-b526-921e55a3152f-kube-api-access-p5s7q\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.279695 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.279709 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.279721 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0001a65f-37dd-467e-b526-921e55a3152f-logs\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.279733 4740 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0001a65f-37dd-467e-b526-921e55a3152f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.383821 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5b5964cd58-f9xhr"] Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.396857 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5b5964cd58-f9xhr"] Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.405299 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 05 14:09:52 crc kubenswrapper[4740]: I0105 14:09:52.993748 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0001a65f-37dd-467e-b526-921e55a3152f" path="/var/lib/kubelet/pods/0001a65f-37dd-467e-b526-921e55a3152f/volumes" Jan 05 14:09:53 crc kubenswrapper[4740]: I0105 14:09:53.184550 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:53 crc kubenswrapper[4740]: I0105 14:09:53.185939 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-85c7d9bc6-f85sn" Jan 05 14:09:54 crc kubenswrapper[4740]: I0105 14:09:54.070728 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5b7fccd64c-7x22h" Jan 05 14:09:55 crc kubenswrapper[4740]: I0105 14:09:55.852469 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 05 14:09:55 crc kubenswrapper[4740]: E0105 14:09:55.853206 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0001a65f-37dd-467e-b526-921e55a3152f" containerName="barbican-api" Jan 05 14:09:55 crc kubenswrapper[4740]: I0105 14:09:55.853220 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0001a65f-37dd-467e-b526-921e55a3152f" containerName="barbican-api" Jan 05 14:09:55 crc kubenswrapper[4740]: E0105 14:09:55.853271 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0001a65f-37dd-467e-b526-921e55a3152f" containerName="barbican-api-log" Jan 05 14:09:55 crc kubenswrapper[4740]: I0105 14:09:55.853278 4740 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0001a65f-37dd-467e-b526-921e55a3152f" containerName="barbican-api-log" Jan 05 14:09:55 crc kubenswrapper[4740]: I0105 14:09:55.853465 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0001a65f-37dd-467e-b526-921e55a3152f" containerName="barbican-api-log" Jan 05 14:09:55 crc kubenswrapper[4740]: I0105 14:09:55.853490 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0001a65f-37dd-467e-b526-921e55a3152f" containerName="barbican-api" Jan 05 14:09:55 crc kubenswrapper[4740]: I0105 14:09:55.854198 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 05 14:09:55 crc kubenswrapper[4740]: I0105 14:09:55.856414 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 05 14:09:55 crc kubenswrapper[4740]: I0105 14:09:55.857440 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-s92kx" Jan 05 14:09:55 crc kubenswrapper[4740]: I0105 14:09:55.857561 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 05 14:09:55 crc kubenswrapper[4740]: I0105 14:09:55.874025 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 05 14:09:55 crc kubenswrapper[4740]: I0105 14:09:55.965804 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6605f3c2-8a9f-45d4-9060-8e44e4bccac6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6605f3c2-8a9f-45d4-9060-8e44e4bccac6\") " pod="openstack/openstackclient" Jan 05 14:09:55 crc kubenswrapper[4740]: I0105 14:09:55.965971 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrk98\" (UniqueName: \"kubernetes.io/projected/6605f3c2-8a9f-45d4-9060-8e44e4bccac6-kube-api-access-hrk98\") pod \"openstackclient\" (UID: \"6605f3c2-8a9f-45d4-9060-8e44e4bccac6\") " pod="openstack/openstackclient" Jan 05 14:09:55 crc kubenswrapper[4740]: I0105 14:09:55.966075 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6605f3c2-8a9f-45d4-9060-8e44e4bccac6-openstack-config-secret\") pod \"openstackclient\" (UID: \"6605f3c2-8a9f-45d4-9060-8e44e4bccac6\") " pod="openstack/openstackclient" Jan 05 14:09:55 crc kubenswrapper[4740]: I0105 14:09:55.966112 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6605f3c2-8a9f-45d4-9060-8e44e4bccac6-openstack-config\") pod \"openstackclient\" (UID: \"6605f3c2-8a9f-45d4-9060-8e44e4bccac6\") " pod="openstack/openstackclient" Jan 05 14:09:56 crc kubenswrapper[4740]: I0105 14:09:56.068210 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6605f3c2-8a9f-45d4-9060-8e44e4bccac6-openstack-config-secret\") pod \"openstackclient\" (UID: \"6605f3c2-8a9f-45d4-9060-8e44e4bccac6\") " pod="openstack/openstackclient" Jan 05 14:09:56 crc kubenswrapper[4740]: I0105 14:09:56.069043 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6605f3c2-8a9f-45d4-9060-8e44e4bccac6-openstack-config\") pod 
\"openstackclient\" (UID: \"6605f3c2-8a9f-45d4-9060-8e44e4bccac6\") " pod="openstack/openstackclient" Jan 05 14:09:56 crc kubenswrapper[4740]: I0105 14:09:56.069566 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6605f3c2-8a9f-45d4-9060-8e44e4bccac6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6605f3c2-8a9f-45d4-9060-8e44e4bccac6\") " pod="openstack/openstackclient" Jan 05 14:09:56 crc kubenswrapper[4740]: I0105 14:09:56.070413 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrk98\" (UniqueName: \"kubernetes.io/projected/6605f3c2-8a9f-45d4-9060-8e44e4bccac6-kube-api-access-hrk98\") pod \"openstackclient\" (UID: \"6605f3c2-8a9f-45d4-9060-8e44e4bccac6\") " pod="openstack/openstackclient" Jan 05 14:09:56 crc kubenswrapper[4740]: I0105 14:09:56.070430 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6605f3c2-8a9f-45d4-9060-8e44e4bccac6-openstack-config\") pod \"openstackclient\" (UID: \"6605f3c2-8a9f-45d4-9060-8e44e4bccac6\") " pod="openstack/openstackclient" Jan 05 14:09:56 crc kubenswrapper[4740]: I0105 14:09:56.074364 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6605f3c2-8a9f-45d4-9060-8e44e4bccac6-openstack-config-secret\") pod \"openstackclient\" (UID: \"6605f3c2-8a9f-45d4-9060-8e44e4bccac6\") " pod="openstack/openstackclient" Jan 05 14:09:56 crc kubenswrapper[4740]: I0105 14:09:56.074701 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6605f3c2-8a9f-45d4-9060-8e44e4bccac6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6605f3c2-8a9f-45d4-9060-8e44e4bccac6\") " pod="openstack/openstackclient" Jan 05 14:09:56 crc kubenswrapper[4740]: I0105 14:09:56.088247 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrk98\" (UniqueName: \"kubernetes.io/projected/6605f3c2-8a9f-45d4-9060-8e44e4bccac6-kube-api-access-hrk98\") pod \"openstackclient\" (UID: \"6605f3c2-8a9f-45d4-9060-8e44e4bccac6\") " pod="openstack/openstackclient" Jan 05 14:09:56 crc kubenswrapper[4740]: I0105 14:09:56.172603 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 05 14:09:56 crc kubenswrapper[4740]: I0105 14:09:56.696322 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 05 14:09:57 crc kubenswrapper[4740]: I0105 14:09:57.102611 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6605f3c2-8a9f-45d4-9060-8e44e4bccac6","Type":"ContainerStarted","Data":"5735ebdc1fac661f0dcf6b09e43fac2fc1f91d930f93195a05371430eca07612"} Jan 05 14:09:57 crc kubenswrapper[4740]: I0105 14:09:57.657565 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.289500 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-7b9bfc84c-zlzbm"] Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.292122 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.295431 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.297360 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-vhwxd" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.297571 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.321243 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-config-data\") pod \"heat-engine-7b9bfc84c-zlzbm\" (UID: \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\") " pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.321296 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-combined-ca-bundle\") pod \"heat-engine-7b9bfc84c-zlzbm\" (UID: \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\") " pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.321476 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsx2h\" (UniqueName: \"kubernetes.io/projected/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-kube-api-access-bsx2h\") pod \"heat-engine-7b9bfc84c-zlzbm\" (UID: \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\") " pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.321503 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-config-data-custom\") pod \"heat-engine-7b9bfc84c-zlzbm\" (UID: \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\") " pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.322564 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7b9bfc84c-zlzbm"] Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.424377 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsx2h\" (UniqueName: \"kubernetes.io/projected/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-kube-api-access-bsx2h\") pod \"heat-engine-7b9bfc84c-zlzbm\" (UID: \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\") " pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.424415 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-config-data-custom\") pod \"heat-engine-7b9bfc84c-zlzbm\" (UID: \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\") " pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.424484 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-config-data\") pod \"heat-engine-7b9bfc84c-zlzbm\" (UID: \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\") " pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 
14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.424507 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-combined-ca-bundle\") pod \"heat-engine-7b9bfc84c-zlzbm\" (UID: \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\") " pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.438232 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-config-data-custom\") pod \"heat-engine-7b9bfc84c-zlzbm\" (UID: \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\") " pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.453740 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-config-data\") pod \"heat-engine-7b9bfc84c-zlzbm\" (UID: \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\") " pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.468814 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsx2h\" (UniqueName: \"kubernetes.io/projected/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-kube-api-access-bsx2h\") pod \"heat-engine-7b9bfc84c-zlzbm\" (UID: \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\") " pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.493865 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-combined-ca-bundle\") pod \"heat-engine-7b9bfc84c-zlzbm\" (UID: \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\") " pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.615718 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.617537 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-dfp7f"] Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.620199 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.648712 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6c8776b56d-szfht"] Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.671502 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6c8776b56d-szfht" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.678034 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-dfp7f"] Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.683711 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.694617 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6c8776b56d-szfht"] Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.715591 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-d9fb66bc8-jrdhh"] Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.717983 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-d9fb66bc8-jrdhh" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.723121 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.770540 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.770604 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.770921 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.770964 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhdnn\" (UniqueName: \"kubernetes.io/projected/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-kube-api-access-dhdnn\") pod \"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.776750 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.776814 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-config\") pod \"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: 
I0105 14:10:01.845142 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-d9fb66bc8-jrdhh"] Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.880029 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr2mq\" (UniqueName: \"kubernetes.io/projected/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-kube-api-access-wr2mq\") pod \"heat-api-d9fb66bc8-jrdhh\" (UID: \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\") " pod="openstack/heat-api-d9fb66bc8-jrdhh" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.880834 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-config-data\") pod \"heat-cfnapi-6c8776b56d-szfht\" (UID: \"9c500176-40a9-42bd-83f6-1dc0df20484b\") " pod="openstack/heat-cfnapi-6c8776b56d-szfht" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.880986 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.881169 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhdnn\" (UniqueName: \"kubernetes.io/projected/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-kube-api-access-dhdnn\") pod \"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.881262 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-config-data-custom\") pod \"heat-cfnapi-6c8776b56d-szfht\" (UID: \"9c500176-40a9-42bd-83f6-1dc0df20484b\") " pod="openstack/heat-cfnapi-6c8776b56d-szfht" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.881431 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-config-data-custom\") pod \"heat-api-d9fb66bc8-jrdhh\" (UID: \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\") " pod="openstack/heat-api-d9fb66bc8-jrdhh" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.881526 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-combined-ca-bundle\") pod \"heat-cfnapi-6c8776b56d-szfht\" (UID: \"9c500176-40a9-42bd-83f6-1dc0df20484b\") " pod="openstack/heat-cfnapi-6c8776b56d-szfht" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.881633 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48j9l\" (UniqueName: \"kubernetes.io/projected/9c500176-40a9-42bd-83f6-1dc0df20484b-kube-api-access-48j9l\") pod \"heat-cfnapi-6c8776b56d-szfht\" (UID: \"9c500176-40a9-42bd-83f6-1dc0df20484b\") " pod="openstack/heat-cfnapi-6c8776b56d-szfht" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.881752 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.881859 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-config\") pod \"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.881937 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.882021 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.882262 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-combined-ca-bundle\") pod \"heat-api-d9fb66bc8-jrdhh\" (UID: \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\") " pod="openstack/heat-api-d9fb66bc8-jrdhh" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.882355 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-config-data\") pod \"heat-api-d9fb66bc8-jrdhh\" (UID: \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\") " pod="openstack/heat-api-d9fb66bc8-jrdhh" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.883696 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.884434 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.884910 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-config\") pod \"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.885409 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-dns-swift-storage-0\") pod 
\"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.887778 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.921192 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.921244 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.941423 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhdnn\" (UniqueName: \"kubernetes.io/projected/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-kube-api-access-dhdnn\") pod \"dnsmasq-dns-688b9f5b49-dfp7f\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.964930 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.989400 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-combined-ca-bundle\") pod \"heat-api-d9fb66bc8-jrdhh\" (UID: \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\") " pod="openstack/heat-api-d9fb66bc8-jrdhh" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.989454 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-config-data\") pod \"heat-api-d9fb66bc8-jrdhh\" (UID: \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\") " pod="openstack/heat-api-d9fb66bc8-jrdhh" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.989518 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr2mq\" (UniqueName: \"kubernetes.io/projected/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-kube-api-access-wr2mq\") pod \"heat-api-d9fb66bc8-jrdhh\" (UID: \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\") " pod="openstack/heat-api-d9fb66bc8-jrdhh" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.989555 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-config-data\") pod \"heat-cfnapi-6c8776b56d-szfht\" (UID: \"9c500176-40a9-42bd-83f6-1dc0df20484b\") " pod="openstack/heat-cfnapi-6c8776b56d-szfht" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.989599 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-config-data-custom\") pod \"heat-cfnapi-6c8776b56d-szfht\" (UID: \"9c500176-40a9-42bd-83f6-1dc0df20484b\") " pod="openstack/heat-cfnapi-6c8776b56d-szfht" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.989652 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-config-data-custom\") pod \"heat-api-d9fb66bc8-jrdhh\" (UID: \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\") " pod="openstack/heat-api-d9fb66bc8-jrdhh" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.989681 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-combined-ca-bundle\") pod \"heat-cfnapi-6c8776b56d-szfht\" (UID: \"9c500176-40a9-42bd-83f6-1dc0df20484b\") " pod="openstack/heat-cfnapi-6c8776b56d-szfht" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.989719 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48j9l\" (UniqueName: \"kubernetes.io/projected/9c500176-40a9-42bd-83f6-1dc0df20484b-kube-api-access-48j9l\") pod \"heat-cfnapi-6c8776b56d-szfht\" (UID: \"9c500176-40a9-42bd-83f6-1dc0df20484b\") " pod="openstack/heat-cfnapi-6c8776b56d-szfht" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.995678 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-config-data\") pod \"heat-cfnapi-6c8776b56d-szfht\" (UID: \"9c500176-40a9-42bd-83f6-1dc0df20484b\") " pod="openstack/heat-cfnapi-6c8776b56d-szfht" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.999226 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-config-data-custom\") pod \"heat-api-d9fb66bc8-jrdhh\" (UID: \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\") " pod="openstack/heat-api-d9fb66bc8-jrdhh" Jan 05 14:10:01 crc kubenswrapper[4740]: I0105 14:10:01.999469 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-combined-ca-bundle\") pod \"heat-api-d9fb66bc8-jrdhh\" (UID: \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\") " pod="openstack/heat-api-d9fb66bc8-jrdhh" Jan 05 14:10:02 crc kubenswrapper[4740]: I0105 14:10:02.002903 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-combined-ca-bundle\") pod \"heat-cfnapi-6c8776b56d-szfht\" (UID: \"9c500176-40a9-42bd-83f6-1dc0df20484b\") " pod="openstack/heat-cfnapi-6c8776b56d-szfht" Jan 05 14:10:02 crc kubenswrapper[4740]: I0105 14:10:02.007939 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-config-data\") pod \"heat-api-d9fb66bc8-jrdhh\" (UID: \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\") " pod="openstack/heat-api-d9fb66bc8-jrdhh" Jan 05 14:10:02 crc kubenswrapper[4740]: I0105 14:10:02.008959 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-config-data-custom\") pod 
\"heat-cfnapi-6c8776b56d-szfht\" (UID: \"9c500176-40a9-42bd-83f6-1dc0df20484b\") " pod="openstack/heat-cfnapi-6c8776b56d-szfht" Jan 05 14:10:02 crc kubenswrapper[4740]: I0105 14:10:02.020524 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48j9l\" (UniqueName: \"kubernetes.io/projected/9c500176-40a9-42bd-83f6-1dc0df20484b-kube-api-access-48j9l\") pod \"heat-cfnapi-6c8776b56d-szfht\" (UID: \"9c500176-40a9-42bd-83f6-1dc0df20484b\") " pod="openstack/heat-cfnapi-6c8776b56d-szfht" Jan 05 14:10:02 crc kubenswrapper[4740]: I0105 14:10:02.033729 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr2mq\" (UniqueName: \"kubernetes.io/projected/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-kube-api-access-wr2mq\") pod \"heat-api-d9fb66bc8-jrdhh\" (UID: \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\") " pod="openstack/heat-api-d9fb66bc8-jrdhh" Jan 05 14:10:02 crc kubenswrapper[4740]: I0105 14:10:02.111403 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-d9fb66bc8-jrdhh" Jan 05 14:10:02 crc kubenswrapper[4740]: I0105 14:10:02.307953 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6c8776b56d-szfht" Jan 05 14:10:02 crc kubenswrapper[4740]: I0105 14:10:02.443788 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7b9bfc84c-zlzbm"] Jan 05 14:10:02 crc kubenswrapper[4740]: I0105 14:10:02.785902 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-dfp7f"] Jan 05 14:10:02 crc kubenswrapper[4740]: I0105 14:10:02.923189 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-d9fb66bc8-jrdhh"] Jan 05 14:10:02 crc kubenswrapper[4740]: W0105 14:10:02.952979 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2ace04b_27fb_40e6_9cee_6ab7d742cff7.slice/crio-f8217e03f88b13e3b3bc11620eb69f43e2862cfe334d3e6ca1606faa6c446e14 WatchSource:0}: Error finding container f8217e03f88b13e3b3bc11620eb69f43e2862cfe334d3e6ca1606faa6c446e14: Status 404 returned error can't find the container with id f8217e03f88b13e3b3bc11620eb69f43e2862cfe334d3e6ca1606faa6c446e14 Jan 05 14:10:03 crc kubenswrapper[4740]: I0105 14:10:03.181277 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6c8776b56d-szfht"] Jan 05 14:10:03 crc kubenswrapper[4740]: I0105 14:10:03.203653 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d9fb66bc8-jrdhh" event={"ID":"a2ace04b-27fb-40e6-9cee-6ab7d742cff7","Type":"ContainerStarted","Data":"f8217e03f88b13e3b3bc11620eb69f43e2862cfe334d3e6ca1606faa6c446e14"} Jan 05 14:10:03 crc kubenswrapper[4740]: I0105 14:10:03.207444 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c8776b56d-szfht" event={"ID":"9c500176-40a9-42bd-83f6-1dc0df20484b","Type":"ContainerStarted","Data":"d3888f80012cf379466471fba100ebd63f873d82420d7d63f94a9ca3166520a1"} Jan 05 14:10:03 crc kubenswrapper[4740]: I0105 14:10:03.208670 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" event={"ID":"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2","Type":"ContainerStarted","Data":"75c0ac1698ad5b97084bae244602875970b924e847ca51719de832d00f4ed0b0"} Jan 05 14:10:03 crc kubenswrapper[4740]: I0105 14:10:03.210229 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-engine-7b9bfc84c-zlzbm" event={"ID":"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1","Type":"ContainerStarted","Data":"3ae2e0692092a754d462f90895015535a659a53214fa4c504b3775352136f7ee"} Jan 05 14:10:03 crc kubenswrapper[4740]: I0105 14:10:03.210259 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7b9bfc84c-zlzbm" event={"ID":"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1","Type":"ContainerStarted","Data":"26bacf4364cc15a34cb5970d252f51c3fecaaa0d28a361a83fff4142b512a12b"} Jan 05 14:10:03 crc kubenswrapper[4740]: I0105 14:10:03.213031 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 14:10:03 crc kubenswrapper[4740]: I0105 14:10:03.245056 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-7b9bfc84c-zlzbm" podStartSLOduration=2.245015662 podStartE2EDuration="2.245015662s" podCreationTimestamp="2026-01-05 14:10:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:10:03.23001867 +0000 UTC m=+1252.536927249" watchObservedRunningTime="2026-01-05 14:10:03.245015662 +0000 UTC m=+1252.551924241" Jan 05 14:10:04 crc kubenswrapper[4740]: I0105 14:10:04.229711 4740 generic.go:334] "Generic (PLEG): container finished" podID="6a8de10c-cfde-436d-bdc9-2a42c89ac2c2" containerID="f3ea946a9d1c5c6d0b54824bbec0b4034a55e57be80cde59ac7029bb830a11e2" exitCode=0 Jan 05 14:10:04 crc kubenswrapper[4740]: I0105 14:10:04.229908 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" event={"ID":"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2","Type":"ContainerDied","Data":"f3ea946a9d1c5c6d0b54824bbec0b4034a55e57be80cde59ac7029bb830a11e2"} Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.256365 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" event={"ID":"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2","Type":"ContainerStarted","Data":"3910165a921eccca67f947ed5072f932950b693b128335a004d40fe70025c29c"} Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.256768 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.279495 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" podStartSLOduration=4.279480034 podStartE2EDuration="4.279480034s" podCreationTimestamp="2026-01-05 14:10:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:10:05.271287584 +0000 UTC m=+1254.578196163" watchObservedRunningTime="2026-01-05 14:10:05.279480034 +0000 UTC m=+1254.586388613" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.396187 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7d5f88dcf-flfmp"] Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.398763 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.404308 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.404393 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.405261 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.416307 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7d5f88dcf-flfmp"] Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.567635 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/981fd275-b6a0-4023-b7a0-17427043fab4-run-httpd\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.568018 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/981fd275-b6a0-4023-b7a0-17427043fab4-log-httpd\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.568145 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69nbb\" (UniqueName: \"kubernetes.io/projected/981fd275-b6a0-4023-b7a0-17427043fab4-kube-api-access-69nbb\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.568252 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/981fd275-b6a0-4023-b7a0-17427043fab4-etc-swift\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.568373 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981fd275-b6a0-4023-b7a0-17427043fab4-combined-ca-bundle\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.568531 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/981fd275-b6a0-4023-b7a0-17427043fab4-internal-tls-certs\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.568620 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981fd275-b6a0-4023-b7a0-17427043fab4-config-data\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " 
pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.568764 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/981fd275-b6a0-4023-b7a0-17427043fab4-public-tls-certs\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.678685 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69nbb\" (UniqueName: \"kubernetes.io/projected/981fd275-b6a0-4023-b7a0-17427043fab4-kube-api-access-69nbb\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.678756 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/981fd275-b6a0-4023-b7a0-17427043fab4-etc-swift\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.678849 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981fd275-b6a0-4023-b7a0-17427043fab4-combined-ca-bundle\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.678993 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/981fd275-b6a0-4023-b7a0-17427043fab4-internal-tls-certs\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.679025 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981fd275-b6a0-4023-b7a0-17427043fab4-config-data\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.679151 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/981fd275-b6a0-4023-b7a0-17427043fab4-public-tls-certs\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.679241 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/981fd275-b6a0-4023-b7a0-17427043fab4-run-httpd\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.679364 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/981fd275-b6a0-4023-b7a0-17427043fab4-log-httpd\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc 
kubenswrapper[4740]: I0105 14:10:05.679945 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/981fd275-b6a0-4023-b7a0-17427043fab4-log-httpd\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.680558 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/981fd275-b6a0-4023-b7a0-17427043fab4-run-httpd\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.694742 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/981fd275-b6a0-4023-b7a0-17427043fab4-internal-tls-certs\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.695311 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/981fd275-b6a0-4023-b7a0-17427043fab4-config-data\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.695483 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/981fd275-b6a0-4023-b7a0-17427043fab4-combined-ca-bundle\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.695541 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/981fd275-b6a0-4023-b7a0-17427043fab4-etc-swift\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.699180 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/981fd275-b6a0-4023-b7a0-17427043fab4-public-tls-certs\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.700669 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69nbb\" (UniqueName: \"kubernetes.io/projected/981fd275-b6a0-4023-b7a0-17427043fab4-kube-api-access-69nbb\") pod \"swift-proxy-7d5f88dcf-flfmp\" (UID: \"981fd275-b6a0-4023-b7a0-17427043fab4\") " pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:05 crc kubenswrapper[4740]: I0105 14:10:05.790249 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:06 crc kubenswrapper[4740]: I0105 14:10:06.527458 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:06 crc kubenswrapper[4740]: I0105 14:10:06.528424 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="ceilometer-central-agent" containerID="cri-o://82e8e6b99c9f49bdc55592fe4b03219dc6cc593a93cc72d1ab2c5aa24c9cd0c6" gracePeriod=30 Jan 05 14:10:06 crc kubenswrapper[4740]: I0105 14:10:06.528534 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="ceilometer-notification-agent" containerID="cri-o://3b029f5ff7dd3df43796804a16b4a895b0070043a317dddcfa20e34cd7d883e2" gracePeriod=30 Jan 05 14:10:06 crc kubenswrapper[4740]: I0105 14:10:06.528555 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="proxy-httpd" containerID="cri-o://39814a101297bf5534d31b9ad43153757f28a60b5ac6a6f04faf202643b2b7ec" gracePeriod=30 Jan 05 14:10:06 crc kubenswrapper[4740]: I0105 14:10:06.528740 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="sg-core" containerID="cri-o://237056f5c7edf54c862457bc646fed0e3f26ddd280d0a004c8c8ae6712b692df" gracePeriod=30 Jan 05 14:10:06 crc kubenswrapper[4740]: I0105 14:10:06.561725 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Jan 05 14:10:07 crc kubenswrapper[4740]: I0105 14:10:07.275403 4740 generic.go:334] "Generic (PLEG): container finished" podID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerID="39814a101297bf5534d31b9ad43153757f28a60b5ac6a6f04faf202643b2b7ec" exitCode=0 Jan 05 14:10:07 crc kubenswrapper[4740]: I0105 14:10:07.275735 4740 generic.go:334] "Generic (PLEG): container finished" podID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerID="237056f5c7edf54c862457bc646fed0e3f26ddd280d0a004c8c8ae6712b692df" exitCode=2 Jan 05 14:10:07 crc kubenswrapper[4740]: I0105 14:10:07.275747 4740 generic.go:334] "Generic (PLEG): container finished" podID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerID="82e8e6b99c9f49bdc55592fe4b03219dc6cc593a93cc72d1ab2c5aa24c9cd0c6" exitCode=0 Jan 05 14:10:07 crc kubenswrapper[4740]: I0105 14:10:07.275482 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"953b81f2-4ef4-4837-ae64-fed14572e2e6","Type":"ContainerDied","Data":"39814a101297bf5534d31b9ad43153757f28a60b5ac6a6f04faf202643b2b7ec"} Jan 05 14:10:07 crc kubenswrapper[4740]: I0105 14:10:07.275822 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"953b81f2-4ef4-4837-ae64-fed14572e2e6","Type":"ContainerDied","Data":"237056f5c7edf54c862457bc646fed0e3f26ddd280d0a004c8c8ae6712b692df"} Jan 05 14:10:07 crc kubenswrapper[4740]: I0105 14:10:07.275837 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"953b81f2-4ef4-4837-ae64-fed14572e2e6","Type":"ContainerDied","Data":"82e8e6b99c9f49bdc55592fe4b03219dc6cc593a93cc72d1ab2c5aa24c9cd0c6"} Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.157530 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.158920 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ecdaebe5-8250-4c27-a69a-3b1ebd335a48" containerName="glance-log" containerID="cri-o://9d257f1cb533227dae1cbe31e1a03ab8d97935b72b69fb285052eddf1cb73c4b" gracePeriod=30 Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.159471 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ecdaebe5-8250-4c27-a69a-3b1ebd335a48" containerName="glance-httpd" containerID="cri-o://73c75d4459e537b669822a1c2b1e48e92d76fb08eb1b0b1fe00322a5b4afb4a6" gracePeriod=30 Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.298924 4740 generic.go:334] "Generic (PLEG): container finished" podID="b8e0a645-f5dd-4e8f-a166-310bec2301cd" containerID="e9cc1482edcbcb82406ec9648c0304d77917f947a6470b9aa5aadad1df652348" exitCode=137 Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.298991 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b8e0a645-f5dd-4e8f-a166-310bec2301cd","Type":"ContainerDied","Data":"e9cc1482edcbcb82406ec9648c0304d77917f947a6470b9aa5aadad1df652348"} Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.309759 4740 generic.go:334] "Generic (PLEG): container finished" podID="ecdaebe5-8250-4c27-a69a-3b1ebd335a48" containerID="9d257f1cb533227dae1cbe31e1a03ab8d97935b72b69fb285052eddf1cb73c4b" exitCode=143 Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.309805 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ecdaebe5-8250-4c27-a69a-3b1ebd335a48","Type":"ContainerDied","Data":"9d257f1cb533227dae1cbe31e1a03ab8d97935b72b69fb285052eddf1cb73c4b"} Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.338926 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.213:3000/\": dial tcp 10.217.0.213:3000: connect: connection refused" Jan 05 14:10:09 crc kubenswrapper[4740]: E0105 14:10:09.382834 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecdaebe5_8250_4c27_a69a_3b1ebd335a48.slice/crio-conmon-9d257f1cb533227dae1cbe31e1a03ab8d97935b72b69fb285052eddf1cb73c4b.scope\": RecentStats: unable to find data in memory cache]" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.587391 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="b8e0a645-f5dd-4e8f-a166-310bec2301cd" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.212:8776/healthcheck\": dial tcp 10.217.0.212:8776: connect: connection refused" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.655281 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-d7697fc9-j6p2f"] Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.657283 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.672492 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-d7697fc9-j6p2f"] Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.683123 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5879f5d6d6-rz6db"] Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.684659 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.684691 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-config-data-custom\") pod \"heat-engine-d7697fc9-j6p2f\" (UID: \"42341a11-18a3-4ef1-8e12-a9450f68370e\") " pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.684932 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krvt6\" (UniqueName: \"kubernetes.io/projected/42341a11-18a3-4ef1-8e12-a9450f68370e-kube-api-access-krvt6\") pod \"heat-engine-d7697fc9-j6p2f\" (UID: \"42341a11-18a3-4ef1-8e12-a9450f68370e\") " pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.685021 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-config-data\") pod \"heat-engine-d7697fc9-j6p2f\" (UID: \"42341a11-18a3-4ef1-8e12-a9450f68370e\") " pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.685109 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-combined-ca-bundle\") pod \"heat-engine-d7697fc9-j6p2f\" (UID: \"42341a11-18a3-4ef1-8e12-a9450f68370e\") " pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.696866 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-599d7559df-v72j9"] Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.699513 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.729873 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5879f5d6d6-rz6db"] Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.784455 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-599d7559df-v72j9"] Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.787169 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krvt6\" (UniqueName: \"kubernetes.io/projected/42341a11-18a3-4ef1-8e12-a9450f68370e-kube-api-access-krvt6\") pod \"heat-engine-d7697fc9-j6p2f\" (UID: \"42341a11-18a3-4ef1-8e12-a9450f68370e\") " pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.787250 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-config-data\") pod \"heat-engine-d7697fc9-j6p2f\" (UID: \"42341a11-18a3-4ef1-8e12-a9450f68370e\") " pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.787273 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-config-data\") pod \"heat-api-599d7559df-v72j9\" (UID: \"49f975a3-3b3b-4dcf-9199-ec3acf657062\") " pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.787291 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v768n\" (UniqueName: \"kubernetes.io/projected/49f975a3-3b3b-4dcf-9199-ec3acf657062-kube-api-access-v768n\") pod \"heat-api-599d7559df-v72j9\" (UID: \"49f975a3-3b3b-4dcf-9199-ec3acf657062\") " pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.787329 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-combined-ca-bundle\") pod \"heat-engine-d7697fc9-j6p2f\" (UID: \"42341a11-18a3-4ef1-8e12-a9450f68370e\") " pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.787357 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-config-data\") pod \"heat-cfnapi-5879f5d6d6-rz6db\" (UID: \"49f587f8-8537-4ed2-a04f-715a8ea4781c\") " pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.787390 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-config-data-custom\") pod \"heat-engine-d7697fc9-j6p2f\" (UID: \"42341a11-18a3-4ef1-8e12-a9450f68370e\") " pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.787409 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-combined-ca-bundle\") pod \"heat-cfnapi-5879f5d6d6-rz6db\" (UID: \"49f587f8-8537-4ed2-a04f-715a8ea4781c\") " 
pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.787460 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-config-data-custom\") pod \"heat-api-599d7559df-v72j9\" (UID: \"49f975a3-3b3b-4dcf-9199-ec3acf657062\") " pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.787508 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtb78\" (UniqueName: \"kubernetes.io/projected/49f587f8-8537-4ed2-a04f-715a8ea4781c-kube-api-access-rtb78\") pod \"heat-cfnapi-5879f5d6d6-rz6db\" (UID: \"49f587f8-8537-4ed2-a04f-715a8ea4781c\") " pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.787528 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-combined-ca-bundle\") pod \"heat-api-599d7559df-v72j9\" (UID: \"49f975a3-3b3b-4dcf-9199-ec3acf657062\") " pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.787548 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-config-data-custom\") pod \"heat-cfnapi-5879f5d6d6-rz6db\" (UID: \"49f587f8-8537-4ed2-a04f-715a8ea4781c\") " pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.795464 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-combined-ca-bundle\") pod \"heat-engine-d7697fc9-j6p2f\" (UID: \"42341a11-18a3-4ef1-8e12-a9450f68370e\") " pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.796396 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-config-data-custom\") pod \"heat-engine-d7697fc9-j6p2f\" (UID: \"42341a11-18a3-4ef1-8e12-a9450f68370e\") " pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.800099 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-config-data\") pod \"heat-engine-d7697fc9-j6p2f\" (UID: \"42341a11-18a3-4ef1-8e12-a9450f68370e\") " pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.809943 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krvt6\" (UniqueName: \"kubernetes.io/projected/42341a11-18a3-4ef1-8e12-a9450f68370e-kube-api-access-krvt6\") pod \"heat-engine-d7697fc9-j6p2f\" (UID: \"42341a11-18a3-4ef1-8e12-a9450f68370e\") " pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.891434 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-combined-ca-bundle\") pod \"heat-api-599d7559df-v72j9\" (UID: 
\"49f975a3-3b3b-4dcf-9199-ec3acf657062\") " pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.892187 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-config-data-custom\") pod \"heat-cfnapi-5879f5d6d6-rz6db\" (UID: \"49f587f8-8537-4ed2-a04f-715a8ea4781c\") " pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.892320 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-config-data\") pod \"heat-api-599d7559df-v72j9\" (UID: \"49f975a3-3b3b-4dcf-9199-ec3acf657062\") " pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.892344 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v768n\" (UniqueName: \"kubernetes.io/projected/49f975a3-3b3b-4dcf-9199-ec3acf657062-kube-api-access-v768n\") pod \"heat-api-599d7559df-v72j9\" (UID: \"49f975a3-3b3b-4dcf-9199-ec3acf657062\") " pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.892428 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-config-data\") pod \"heat-cfnapi-5879f5d6d6-rz6db\" (UID: \"49f587f8-8537-4ed2-a04f-715a8ea4781c\") " pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.892475 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-combined-ca-bundle\") pod \"heat-cfnapi-5879f5d6d6-rz6db\" (UID: \"49f587f8-8537-4ed2-a04f-715a8ea4781c\") " pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.892573 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-config-data-custom\") pod \"heat-api-599d7559df-v72j9\" (UID: \"49f975a3-3b3b-4dcf-9199-ec3acf657062\") " pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.892647 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtb78\" (UniqueName: \"kubernetes.io/projected/49f587f8-8537-4ed2-a04f-715a8ea4781c-kube-api-access-rtb78\") pod \"heat-cfnapi-5879f5d6d6-rz6db\" (UID: \"49f587f8-8537-4ed2-a04f-715a8ea4781c\") " pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.896969 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-config-data-custom\") pod \"heat-cfnapi-5879f5d6d6-rz6db\" (UID: \"49f587f8-8537-4ed2-a04f-715a8ea4781c\") " pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.897995 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-config-data\") pod \"heat-cfnapi-5879f5d6d6-rz6db\" (UID: \"49f587f8-8537-4ed2-a04f-715a8ea4781c\") " 
pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.898613 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-combined-ca-bundle\") pod \"heat-cfnapi-5879f5d6d6-rz6db\" (UID: \"49f587f8-8537-4ed2-a04f-715a8ea4781c\") " pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.898699 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-combined-ca-bundle\") pod \"heat-api-599d7559df-v72j9\" (UID: \"49f975a3-3b3b-4dcf-9199-ec3acf657062\") " pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.905758 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-config-data-custom\") pod \"heat-api-599d7559df-v72j9\" (UID: \"49f975a3-3b3b-4dcf-9199-ec3acf657062\") " pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.906225 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-config-data\") pod \"heat-api-599d7559df-v72j9\" (UID: \"49f975a3-3b3b-4dcf-9199-ec3acf657062\") " pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.910328 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtb78\" (UniqueName: \"kubernetes.io/projected/49f587f8-8537-4ed2-a04f-715a8ea4781c-kube-api-access-rtb78\") pod \"heat-cfnapi-5879f5d6d6-rz6db\" (UID: \"49f587f8-8537-4ed2-a04f-715a8ea4781c\") " pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.911003 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v768n\" (UniqueName: \"kubernetes.io/projected/49f975a3-3b3b-4dcf-9199-ec3acf657062-kube-api-access-v768n\") pod \"heat-api-599d7559df-v72j9\" (UID: \"49f975a3-3b3b-4dcf-9199-ec3acf657062\") " pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:09 crc kubenswrapper[4740]: I0105 14:10:09.978266 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.011336 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.035311 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.217894 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-9hbx9"] Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.219358 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9hbx9" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.251502 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9hbx9"] Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.303430 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98s5g\" (UniqueName: \"kubernetes.io/projected/cadffc7b-e58f-4f7e-b312-d477c2ad9429-kube-api-access-98s5g\") pod \"nova-api-db-create-9hbx9\" (UID: \"cadffc7b-e58f-4f7e-b312-d477c2ad9429\") " pod="openstack/nova-api-db-create-9hbx9" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.303548 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cadffc7b-e58f-4f7e-b312-d477c2ad9429-operator-scripts\") pod \"nova-api-db-create-9hbx9\" (UID: \"cadffc7b-e58f-4f7e-b312-d477c2ad9429\") " pod="openstack/nova-api-db-create-9hbx9" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.328777 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-pgxvc"] Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.330197 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pgxvc" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.349726 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pgxvc"] Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.408515 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ef0a3cf-3805-44a3-9c13-601726874901-operator-scripts\") pod \"nova-cell0-db-create-pgxvc\" (UID: \"2ef0a3cf-3805-44a3-9c13-601726874901\") " pod="openstack/nova-cell0-db-create-pgxvc" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.408660 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98s5g\" (UniqueName: \"kubernetes.io/projected/cadffc7b-e58f-4f7e-b312-d477c2ad9429-kube-api-access-98s5g\") pod \"nova-api-db-create-9hbx9\" (UID: \"cadffc7b-e58f-4f7e-b312-d477c2ad9429\") " pod="openstack/nova-api-db-create-9hbx9" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.408689 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb5h2\" (UniqueName: \"kubernetes.io/projected/2ef0a3cf-3805-44a3-9c13-601726874901-kube-api-access-wb5h2\") pod \"nova-cell0-db-create-pgxvc\" (UID: \"2ef0a3cf-3805-44a3-9c13-601726874901\") " pod="openstack/nova-cell0-db-create-pgxvc" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.408718 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cadffc7b-e58f-4f7e-b312-d477c2ad9429-operator-scripts\") pod \"nova-api-db-create-9hbx9\" (UID: \"cadffc7b-e58f-4f7e-b312-d477c2ad9429\") " pod="openstack/nova-api-db-create-9hbx9" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.409639 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cadffc7b-e58f-4f7e-b312-d477c2ad9429-operator-scripts\") pod \"nova-api-db-create-9hbx9\" (UID: \"cadffc7b-e58f-4f7e-b312-d477c2ad9429\") " pod="openstack/nova-api-db-create-9hbx9" Jan 05 14:10:10 crc 
kubenswrapper[4740]: I0105 14:10:10.417696 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-z6tgb"] Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.419034 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-z6tgb" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.447992 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-z6tgb"] Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.455681 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98s5g\" (UniqueName: \"kubernetes.io/projected/cadffc7b-e58f-4f7e-b312-d477c2ad9429-kube-api-access-98s5g\") pod \"nova-api-db-create-9hbx9\" (UID: \"cadffc7b-e58f-4f7e-b312-d477c2ad9429\") " pod="openstack/nova-api-db-create-9hbx9" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.491468 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-5490-account-create-update-mq5tj"] Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.522295 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5490-account-create-update-mq5tj" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.537550 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.544602 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ef0a3cf-3805-44a3-9c13-601726874901-operator-scripts\") pod \"nova-cell0-db-create-pgxvc\" (UID: \"2ef0a3cf-3805-44a3-9c13-601726874901\") " pod="openstack/nova-cell0-db-create-pgxvc" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.544802 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/386101d7-0cc9-471d-8fe8-95af3e74a121-operator-scripts\") pod \"nova-cell1-db-create-z6tgb\" (UID: \"386101d7-0cc9-471d-8fe8-95af3e74a121\") " pod="openstack/nova-cell1-db-create-z6tgb" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.545008 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnxmg\" (UniqueName: \"kubernetes.io/projected/386101d7-0cc9-471d-8fe8-95af3e74a121-kube-api-access-fnxmg\") pod \"nova-cell1-db-create-z6tgb\" (UID: \"386101d7-0cc9-471d-8fe8-95af3e74a121\") " pod="openstack/nova-cell1-db-create-z6tgb" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.545119 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb5h2\" (UniqueName: \"kubernetes.io/projected/2ef0a3cf-3805-44a3-9c13-601726874901-kube-api-access-wb5h2\") pod \"nova-cell0-db-create-pgxvc\" (UID: \"2ef0a3cf-3805-44a3-9c13-601726874901\") " pod="openstack/nova-cell0-db-create-pgxvc" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.545737 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ef0a3cf-3805-44a3-9c13-601726874901-operator-scripts\") pod \"nova-cell0-db-create-pgxvc\" (UID: \"2ef0a3cf-3805-44a3-9c13-601726874901\") " pod="openstack/nova-cell0-db-create-pgxvc" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.554658 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9hbx9" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.605610 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb5h2\" (UniqueName: \"kubernetes.io/projected/2ef0a3cf-3805-44a3-9c13-601726874901-kube-api-access-wb5h2\") pod \"nova-cell0-db-create-pgxvc\" (UID: \"2ef0a3cf-3805-44a3-9c13-601726874901\") " pod="openstack/nova-cell0-db-create-pgxvc" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.652144 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pgxvc" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.655205 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csp68\" (UniqueName: \"kubernetes.io/projected/0b8e97e5-739e-4c24-9768-e6a1fb56bace-kube-api-access-csp68\") pod \"nova-api-5490-account-create-update-mq5tj\" (UID: \"0b8e97e5-739e-4c24-9768-e6a1fb56bace\") " pod="openstack/nova-api-5490-account-create-update-mq5tj" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.655243 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/386101d7-0cc9-471d-8fe8-95af3e74a121-operator-scripts\") pod \"nova-cell1-db-create-z6tgb\" (UID: \"386101d7-0cc9-471d-8fe8-95af3e74a121\") " pod="openstack/nova-cell1-db-create-z6tgb" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.655324 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnxmg\" (UniqueName: \"kubernetes.io/projected/386101d7-0cc9-471d-8fe8-95af3e74a121-kube-api-access-fnxmg\") pod \"nova-cell1-db-create-z6tgb\" (UID: \"386101d7-0cc9-471d-8fe8-95af3e74a121\") " pod="openstack/nova-cell1-db-create-z6tgb" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.655386 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b8e97e5-739e-4c24-9768-e6a1fb56bace-operator-scripts\") pod \"nova-api-5490-account-create-update-mq5tj\" (UID: \"0b8e97e5-739e-4c24-9768-e6a1fb56bace\") " pod="openstack/nova-api-5490-account-create-update-mq5tj" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.656375 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/386101d7-0cc9-471d-8fe8-95af3e74a121-operator-scripts\") pod \"nova-cell1-db-create-z6tgb\" (UID: \"386101d7-0cc9-471d-8fe8-95af3e74a121\") " pod="openstack/nova-cell1-db-create-z6tgb" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.663240 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5490-account-create-update-mq5tj"] Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.685434 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnxmg\" (UniqueName: \"kubernetes.io/projected/386101d7-0cc9-471d-8fe8-95af3e74a121-kube-api-access-fnxmg\") pod \"nova-cell1-db-create-z6tgb\" (UID: \"386101d7-0cc9-471d-8fe8-95af3e74a121\") " pod="openstack/nova-cell1-db-create-z6tgb" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.702025 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4928-account-create-update-h89c7"] Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.703941 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4928-account-create-update-h89c7" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.705886 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.719087 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4928-account-create-update-h89c7"] Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.757861 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wt5b\" (UniqueName: \"kubernetes.io/projected/a6785a14-5ddd-471f-ac7d-eed48bfacd44-kube-api-access-4wt5b\") pod \"nova-cell0-4928-account-create-update-h89c7\" (UID: \"a6785a14-5ddd-471f-ac7d-eed48bfacd44\") " pod="openstack/nova-cell0-4928-account-create-update-h89c7" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.757971 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csp68\" (UniqueName: \"kubernetes.io/projected/0b8e97e5-739e-4c24-9768-e6a1fb56bace-kube-api-access-csp68\") pod \"nova-api-5490-account-create-update-mq5tj\" (UID: \"0b8e97e5-739e-4c24-9768-e6a1fb56bace\") " pod="openstack/nova-api-5490-account-create-update-mq5tj" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.758869 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b8e97e5-739e-4c24-9768-e6a1fb56bace-operator-scripts\") pod \"nova-api-5490-account-create-update-mq5tj\" (UID: \"0b8e97e5-739e-4c24-9768-e6a1fb56bace\") " pod="openstack/nova-api-5490-account-create-update-mq5tj" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.758907 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6785a14-5ddd-471f-ac7d-eed48bfacd44-operator-scripts\") pod \"nova-cell0-4928-account-create-update-h89c7\" (UID: \"a6785a14-5ddd-471f-ac7d-eed48bfacd44\") " pod="openstack/nova-cell0-4928-account-create-update-h89c7" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.759742 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b8e97e5-739e-4c24-9768-e6a1fb56bace-operator-scripts\") pod \"nova-api-5490-account-create-update-mq5tj\" (UID: \"0b8e97e5-739e-4c24-9768-e6a1fb56bace\") " pod="openstack/nova-api-5490-account-create-update-mq5tj" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.762284 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-z6tgb" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.778883 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csp68\" (UniqueName: \"kubernetes.io/projected/0b8e97e5-739e-4c24-9768-e6a1fb56bace-kube-api-access-csp68\") pod \"nova-api-5490-account-create-update-mq5tj\" (UID: \"0b8e97e5-739e-4c24-9768-e6a1fb56bace\") " pod="openstack/nova-api-5490-account-create-update-mq5tj" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.837485 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-6419-account-create-update-fgr2l"] Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.840021 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6419-account-create-update-fgr2l" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.842612 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.861176 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6785a14-5ddd-471f-ac7d-eed48bfacd44-operator-scripts\") pod \"nova-cell0-4928-account-create-update-h89c7\" (UID: \"a6785a14-5ddd-471f-ac7d-eed48bfacd44\") " pod="openstack/nova-cell0-4928-account-create-update-h89c7" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.861584 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wt5b\" (UniqueName: \"kubernetes.io/projected/a6785a14-5ddd-471f-ac7d-eed48bfacd44-kube-api-access-4wt5b\") pod \"nova-cell0-4928-account-create-update-h89c7\" (UID: \"a6785a14-5ddd-471f-ac7d-eed48bfacd44\") " pod="openstack/nova-cell0-4928-account-create-update-h89c7" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.863166 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6785a14-5ddd-471f-ac7d-eed48bfacd44-operator-scripts\") pod \"nova-cell0-4928-account-create-update-h89c7\" (UID: \"a6785a14-5ddd-471f-ac7d-eed48bfacd44\") " pod="openstack/nova-cell0-4928-account-create-update-h89c7" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.877905 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6419-account-create-update-fgr2l"] Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.896139 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5490-account-create-update-mq5tj" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.905169 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wt5b\" (UniqueName: \"kubernetes.io/projected/a6785a14-5ddd-471f-ac7d-eed48bfacd44-kube-api-access-4wt5b\") pod \"nova-cell0-4928-account-create-update-h89c7\" (UID: \"a6785a14-5ddd-471f-ac7d-eed48bfacd44\") " pod="openstack/nova-cell0-4928-account-create-update-h89c7" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.965686 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krtbd\" (UniqueName: \"kubernetes.io/projected/e21239f6-f726-4394-b6fe-f7f7f438d7b5-kube-api-access-krtbd\") pod \"nova-cell1-6419-account-create-update-fgr2l\" (UID: \"e21239f6-f726-4394-b6fe-f7f7f438d7b5\") " pod="openstack/nova-cell1-6419-account-create-update-fgr2l" Jan 05 14:10:10 crc kubenswrapper[4740]: I0105 14:10:10.965832 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e21239f6-f726-4394-b6fe-f7f7f438d7b5-operator-scripts\") pod \"nova-cell1-6419-account-create-update-fgr2l\" (UID: \"e21239f6-f726-4394-b6fe-f7f7f438d7b5\") " pod="openstack/nova-cell1-6419-account-create-update-fgr2l" Jan 05 14:10:11 crc kubenswrapper[4740]: I0105 14:10:11.023015 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4928-account-create-update-h89c7" Jan 05 14:10:11 crc kubenswrapper[4740]: I0105 14:10:11.067568 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krtbd\" (UniqueName: \"kubernetes.io/projected/e21239f6-f726-4394-b6fe-f7f7f438d7b5-kube-api-access-krtbd\") pod \"nova-cell1-6419-account-create-update-fgr2l\" (UID: \"e21239f6-f726-4394-b6fe-f7f7f438d7b5\") " pod="openstack/nova-cell1-6419-account-create-update-fgr2l" Jan 05 14:10:11 crc kubenswrapper[4740]: I0105 14:10:11.067724 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e21239f6-f726-4394-b6fe-f7f7f438d7b5-operator-scripts\") pod \"nova-cell1-6419-account-create-update-fgr2l\" (UID: \"e21239f6-f726-4394-b6fe-f7f7f438d7b5\") " pod="openstack/nova-cell1-6419-account-create-update-fgr2l" Jan 05 14:10:11 crc kubenswrapper[4740]: I0105 14:10:11.068479 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e21239f6-f726-4394-b6fe-f7f7f438d7b5-operator-scripts\") pod \"nova-cell1-6419-account-create-update-fgr2l\" (UID: \"e21239f6-f726-4394-b6fe-f7f7f438d7b5\") " pod="openstack/nova-cell1-6419-account-create-update-fgr2l" Jan 05 14:10:11 crc kubenswrapper[4740]: I0105 14:10:11.094801 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krtbd\" (UniqueName: \"kubernetes.io/projected/e21239f6-f726-4394-b6fe-f7f7f438d7b5-kube-api-access-krtbd\") pod \"nova-cell1-6419-account-create-update-fgr2l\" (UID: \"e21239f6-f726-4394-b6fe-f7f7f438d7b5\") " pod="openstack/nova-cell1-6419-account-create-update-fgr2l" Jan 05 14:10:11 crc kubenswrapper[4740]: I0105 14:10:11.189002 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6419-account-create-update-fgr2l" Jan 05 14:10:11 crc kubenswrapper[4740]: I0105 14:10:11.420776 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 05 14:10:11 crc kubenswrapper[4740]: I0105 14:10:11.421014 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cfe7c040-25ec-49bc-96a4-127e4281ae18" containerName="glance-log" containerID="cri-o://31abe96eff881f2e28f34c9a99c430103b86b759958fd027c0a3fb182a9ad099" gracePeriod=30 Jan 05 14:10:11 crc kubenswrapper[4740]: I0105 14:10:11.421163 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cfe7c040-25ec-49bc-96a4-127e4281ae18" containerName="glance-httpd" containerID="cri-o://a64df19c895975cf0990e36d21938e7dea57cd874d75f3d8562b5365c131a1fc" gracePeriod=30 Jan 05 14:10:11 crc kubenswrapper[4740]: I0105 14:10:11.966189 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.043450 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-8qhgf"] Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.043760 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" podUID="cf33de39-642c-4ea4-8b3b-e505d43016f8" containerName="dnsmasq-dns" containerID="cri-o://37113591b12f42e7504421b7b942eb220d3cb9518e7bc8bd8578e9b1b53a28ac" gracePeriod=10 Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.349329 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="ecdaebe5-8250-4c27-a69a-3b1ebd335a48" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.197:9292/healthcheck\": read tcp 10.217.0.2:38122->10.217.0.197:9292: read: connection reset by peer" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.349332 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="ecdaebe5-8250-4c27-a69a-3b1ebd335a48" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.197:9292/healthcheck\": read tcp 10.217.0.2:38120->10.217.0.197:9292: read: connection reset by peer" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.354055 4740 generic.go:334] "Generic (PLEG): container finished" podID="cf33de39-642c-4ea4-8b3b-e505d43016f8" containerID="37113591b12f42e7504421b7b942eb220d3cb9518e7bc8bd8578e9b1b53a28ac" exitCode=0 Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.354118 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" event={"ID":"cf33de39-642c-4ea4-8b3b-e505d43016f8","Type":"ContainerDied","Data":"37113591b12f42e7504421b7b942eb220d3cb9518e7bc8bd8578e9b1b53a28ac"} Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.362270 4740 generic.go:334] "Generic (PLEG): container finished" podID="cfe7c040-25ec-49bc-96a4-127e4281ae18" containerID="31abe96eff881f2e28f34c9a99c430103b86b759958fd027c0a3fb182a9ad099" exitCode=143 Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.362393 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"cfe7c040-25ec-49bc-96a4-127e4281ae18","Type":"ContainerDied","Data":"31abe96eff881f2e28f34c9a99c430103b86b759958fd027c0a3fb182a9ad099"} Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.371966 4740 generic.go:334] "Generic (PLEG): container finished" podID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerID="3b029f5ff7dd3df43796804a16b4a895b0070043a317dddcfa20e34cd7d883e2" exitCode=0 Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.372051 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"953b81f2-4ef4-4837-ae64-fed14572e2e6","Type":"ContainerDied","Data":"3b029f5ff7dd3df43796804a16b4a895b0070043a317dddcfa20e34cd7d883e2"} Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.539718 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-d9fb66bc8-jrdhh"] Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.551655 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6c8776b56d-szfht"] Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.574218 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-75bf4fb87-2g4f8"] Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.575718 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.578788 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.588037 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.591899 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-8669649c9d-h45rr"] Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.593568 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.598816 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.599079 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.621510 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-75bf4fb87-2g4f8"] Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.646159 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-8669649c9d-h45rr"] Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.708607 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlfzb\" (UniqueName: \"kubernetes.io/projected/116ff769-c12d-42af-93a9-549001762acd-kube-api-access-jlfzb\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.708717 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-config-data\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.708749 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-config-data\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.708768 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-internal-tls-certs\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.708786 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-internal-tls-certs\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.708833 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-combined-ca-bundle\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.708853 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-combined-ca-bundle\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " 
pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.708882 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw66z\" (UniqueName: \"kubernetes.io/projected/9307c633-b44c-4f4d-8414-9f28826c30dc-kube-api-access-kw66z\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.708907 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-public-tls-certs\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.708922 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-config-data-custom\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.708944 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-public-tls-certs\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.708969 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-config-data-custom\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.810478 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlfzb\" (UniqueName: \"kubernetes.io/projected/116ff769-c12d-42af-93a9-549001762acd-kube-api-access-jlfzb\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.810559 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-config-data\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.810588 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-config-data\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.810607 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-internal-tls-certs\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: 
\"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.810621 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-internal-tls-certs\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.810667 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-combined-ca-bundle\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.810686 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-combined-ca-bundle\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.810713 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw66z\" (UniqueName: \"kubernetes.io/projected/9307c633-b44c-4f4d-8414-9f28826c30dc-kube-api-access-kw66z\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.810737 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-public-tls-certs\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.810754 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-config-data-custom\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.810773 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-public-tls-certs\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.810801 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-config-data-custom\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.816900 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-internal-tls-certs\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " 
pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.824445 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-public-tls-certs\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.824611 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-public-tls-certs\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.825743 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-config-data-custom\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.826179 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-config-data\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.828741 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-combined-ca-bundle\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.829990 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-internal-tls-certs\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.832646 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-config-data-custom\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.841726 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-config-data\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.841889 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw66z\" (UniqueName: \"kubernetes.io/projected/9307c633-b44c-4f4d-8414-9f28826c30dc-kube-api-access-kw66z\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.842580 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-combined-ca-bundle\") pod \"heat-cfnapi-75bf4fb87-2g4f8\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.862882 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlfzb\" (UniqueName: \"kubernetes.io/projected/116ff769-c12d-42af-93a9-549001762acd-kube-api-access-jlfzb\") pod \"heat-api-8669649c9d-h45rr\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.918776 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:12 crc kubenswrapper[4740]: I0105 14:10:12.929784 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:13 crc kubenswrapper[4740]: I0105 14:10:13.391509 4740 generic.go:334] "Generic (PLEG): container finished" podID="ecdaebe5-8250-4c27-a69a-3b1ebd335a48" containerID="73c75d4459e537b669822a1c2b1e48e92d76fb08eb1b0b1fe00322a5b4afb4a6" exitCode=0 Jan 05 14:10:13 crc kubenswrapper[4740]: I0105 14:10:13.391550 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ecdaebe5-8250-4c27-a69a-3b1ebd335a48","Type":"ContainerDied","Data":"73c75d4459e537b669822a1c2b1e48e92d76fb08eb1b0b1fe00322a5b4afb4a6"} Jan 05 14:10:14 crc kubenswrapper[4740]: I0105 14:10:14.436715 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" podUID="cf33de39-642c-4ea4-8b3b-e505d43016f8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.211:5353: connect: connection refused" Jan 05 14:10:14 crc kubenswrapper[4740]: I0105 14:10:14.585594 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="b8e0a645-f5dd-4e8f-a166-310bec2301cd" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.212:8776/healthcheck\": dial tcp 10.217.0.212:8776: connect: connection refused" Jan 05 14:10:14 crc kubenswrapper[4740]: E0105 14:10:14.681149 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfe7c040_25ec_49bc_96a4_127e4281ae18.slice/crio-a64df19c895975cf0990e36d21938e7dea57cd874d75f3d8562b5365c131a1fc.scope\": RecentStats: unable to find data in memory cache]" Jan 05 14:10:15 crc kubenswrapper[4740]: I0105 14:10:15.440660 4740 generic.go:334] "Generic (PLEG): container finished" podID="cfe7c040-25ec-49bc-96a4-127e4281ae18" containerID="a64df19c895975cf0990e36d21938e7dea57cd874d75f3d8562b5365c131a1fc" exitCode=0 Jan 05 14:10:15 crc kubenswrapper[4740]: I0105 14:10:15.441106 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cfe7c040-25ec-49bc-96a4-127e4281ae18","Type":"ContainerDied","Data":"a64df19c895975cf0990e36d21938e7dea57cd874d75f3d8562b5365c131a1fc"} Jan 05 14:10:15 crc kubenswrapper[4740]: I0105 14:10:15.795260 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:10:15 crc kubenswrapper[4740]: I0105 14:10:15.890975 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-sg-core-conf-yaml\") pod \"953b81f2-4ef4-4837-ae64-fed14572e2e6\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " Jan 05 14:10:15 crc kubenswrapper[4740]: I0105 14:10:15.891198 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-config-data\") pod \"953b81f2-4ef4-4837-ae64-fed14572e2e6\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " Jan 05 14:10:15 crc kubenswrapper[4740]: I0105 14:10:15.891287 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-combined-ca-bundle\") pod \"953b81f2-4ef4-4837-ae64-fed14572e2e6\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " Jan 05 14:10:15 crc kubenswrapper[4740]: I0105 14:10:15.891373 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/953b81f2-4ef4-4837-ae64-fed14572e2e6-run-httpd\") pod \"953b81f2-4ef4-4837-ae64-fed14572e2e6\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " Jan 05 14:10:15 crc kubenswrapper[4740]: I0105 14:10:15.891406 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zhfj\" (UniqueName: \"kubernetes.io/projected/953b81f2-4ef4-4837-ae64-fed14572e2e6-kube-api-access-5zhfj\") pod \"953b81f2-4ef4-4837-ae64-fed14572e2e6\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " Jan 05 14:10:15 crc kubenswrapper[4740]: I0105 14:10:15.891581 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-scripts\") pod \"953b81f2-4ef4-4837-ae64-fed14572e2e6\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " Jan 05 14:10:15 crc kubenswrapper[4740]: I0105 14:10:15.891629 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/953b81f2-4ef4-4837-ae64-fed14572e2e6-log-httpd\") pod \"953b81f2-4ef4-4837-ae64-fed14572e2e6\" (UID: \"953b81f2-4ef4-4837-ae64-fed14572e2e6\") " Jan 05 14:10:15 crc kubenswrapper[4740]: I0105 14:10:15.893112 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/953b81f2-4ef4-4837-ae64-fed14572e2e6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "953b81f2-4ef4-4837-ae64-fed14572e2e6" (UID: "953b81f2-4ef4-4837-ae64-fed14572e2e6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:10:15 crc kubenswrapper[4740]: I0105 14:10:15.904933 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/953b81f2-4ef4-4837-ae64-fed14572e2e6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "953b81f2-4ef4-4837-ae64-fed14572e2e6" (UID: "953b81f2-4ef4-4837-ae64-fed14572e2e6"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:10:15 crc kubenswrapper[4740]: I0105 14:10:15.993916 4740 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/953b81f2-4ef4-4837-ae64-fed14572e2e6-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:15 crc kubenswrapper[4740]: I0105 14:10:15.993943 4740 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/953b81f2-4ef4-4837-ae64-fed14572e2e6-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:15 crc kubenswrapper[4740]: I0105 14:10:15.996278 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-scripts" (OuterVolumeSpecName: "scripts") pod "953b81f2-4ef4-4837-ae64-fed14572e2e6" (UID: "953b81f2-4ef4-4837-ae64-fed14572e2e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:15.999157 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/953b81f2-4ef4-4837-ae64-fed14572e2e6-kube-api-access-5zhfj" (OuterVolumeSpecName: "kube-api-access-5zhfj") pod "953b81f2-4ef4-4837-ae64-fed14572e2e6" (UID: "953b81f2-4ef4-4837-ae64-fed14572e2e6"). InnerVolumeSpecName "kube-api-access-5zhfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.020438 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "953b81f2-4ef4-4837-ae64-fed14572e2e6" (UID: "953b81f2-4ef4-4837-ae64-fed14572e2e6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.095864 4740 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.096209 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zhfj\" (UniqueName: \"kubernetes.io/projected/953b81f2-4ef4-4837-ae64-fed14572e2e6-kube-api-access-5zhfj\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.096224 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.182111 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "953b81f2-4ef4-4837-ae64-fed14572e2e6" (UID: "953b81f2-4ef4-4837-ae64-fed14572e2e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.186230 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-config-data" (OuterVolumeSpecName: "config-data") pod "953b81f2-4ef4-4837-ae64-fed14572e2e6" (UID: "953b81f2-4ef4-4837-ae64-fed14572e2e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.197812 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.197844 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953b81f2-4ef4-4837-ae64-fed14572e2e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.473651 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"953b81f2-4ef4-4837-ae64-fed14572e2e6","Type":"ContainerDied","Data":"0bce1d13410bfc8006fdec2aeba57752cbad00b9240c1ffdac87df24e1e32f1e"} Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.474031 4740 scope.go:117] "RemoveContainer" containerID="39814a101297bf5534d31b9ad43153757f28a60b5ac6a6f04faf202643b2b7ec" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.474231 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.497582 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" event={"ID":"cf33de39-642c-4ea4-8b3b-e505d43016f8","Type":"ContainerDied","Data":"ce478ddcede8b3f40ac08540280df66fd2b97b4a4ac81f064f25c50e91e7555b"} Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.497625 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce478ddcede8b3f40ac08540280df66fd2b97b4a4ac81f064f25c50e91e7555b" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.507798 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cfe7c040-25ec-49bc-96a4-127e4281ae18","Type":"ContainerDied","Data":"b54f150460b0e76671945aef5c2a65167e930442b442e657e2411565130ff768"} Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.507836 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b54f150460b0e76671945aef5c2a65167e930442b442e657e2411565130ff768" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.511264 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b8e0a645-f5dd-4e8f-a166-310bec2301cd","Type":"ContainerDied","Data":"d23047f23e56252595ac5618a8d6a0bb78eaf058aca74dbb79a1be696e36ba5d"} Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.511287 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d23047f23e56252595ac5618a8d6a0bb78eaf058aca74dbb79a1be696e36ba5d" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.517988 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.526744 4740 scope.go:117] "RemoveContainer" containerID="237056f5c7edf54c862457bc646fed0e3f26ddd280d0a004c8c8ae6712b692df" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.565447 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.588398 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.610121 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.614942 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbtwx\" (UniqueName: \"kubernetes.io/projected/cfe7c040-25ec-49bc-96a4-127e4281ae18-kube-api-access-cbtwx\") pod \"cfe7c040-25ec-49bc-96a4-127e4281ae18\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.615005 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8e0a645-f5dd-4e8f-a166-310bec2301cd-logs\") pod \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.615032 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-config\") pod \"cf33de39-642c-4ea4-8b3b-e505d43016f8\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616138 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") pod \"cfe7c040-25ec-49bc-96a4-127e4281ae18\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616182 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-combined-ca-bundle\") pod \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616239 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfe7c040-25ec-49bc-96a4-127e4281ae18-httpd-run\") pod \"cfe7c040-25ec-49bc-96a4-127e4281ae18\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616263 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-combined-ca-bundle\") pod \"cfe7c040-25ec-49bc-96a4-127e4281ae18\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616292 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b8e0a645-f5dd-4e8f-a166-310bec2301cd-etc-machine-id\") pod \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616327 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-scripts\") pod \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616340 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-config-data-custom\") pod \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616366 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-ovsdbserver-nb\") pod \"cf33de39-642c-4ea4-8b3b-e505d43016f8\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616380 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-dns-swift-storage-0\") pod \"cf33de39-642c-4ea4-8b3b-e505d43016f8\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616416 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-scripts\") pod \"cfe7c040-25ec-49bc-96a4-127e4281ae18\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616477 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-config-data\") pod \"cfe7c040-25ec-49bc-96a4-127e4281ae18\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616505 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn2s8\" (UniqueName: \"kubernetes.io/projected/b8e0a645-f5dd-4e8f-a166-310bec2301cd-kube-api-access-sn2s8\") pod \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616528 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfe7c040-25ec-49bc-96a4-127e4281ae18-logs\") pod \"cfe7c040-25ec-49bc-96a4-127e4281ae18\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616552 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-dns-svc\") pod \"cf33de39-642c-4ea4-8b3b-e505d43016f8\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616606 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv8hd\" (UniqueName: \"kubernetes.io/projected/cf33de39-642c-4ea4-8b3b-e505d43016f8-kube-api-access-tv8hd\") pod \"cf33de39-642c-4ea4-8b3b-e505d43016f8\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616642 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-config-data\") pod \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\" (UID: \"b8e0a645-f5dd-4e8f-a166-310bec2301cd\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616665 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-ovsdbserver-sb\") pod \"cf33de39-642c-4ea4-8b3b-e505d43016f8\" (UID: \"cf33de39-642c-4ea4-8b3b-e505d43016f8\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.616692 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-internal-tls-certs\") pod \"cfe7c040-25ec-49bc-96a4-127e4281ae18\" (UID: \"cfe7c040-25ec-49bc-96a4-127e4281ae18\") " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.618948 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfe7c040-25ec-49bc-96a4-127e4281ae18-logs" (OuterVolumeSpecName: "logs") pod "cfe7c040-25ec-49bc-96a4-127e4281ae18" (UID: "cfe7c040-25ec-49bc-96a4-127e4281ae18"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.620101 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfe7c040-25ec-49bc-96a4-127e4281ae18-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cfe7c040-25ec-49bc-96a4-127e4281ae18" (UID: "cfe7c040-25ec-49bc-96a4-127e4281ae18"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.620533 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8e0a645-f5dd-4e8f-a166-310bec2301cd-logs" (OuterVolumeSpecName: "logs") pod "b8e0a645-f5dd-4e8f-a166-310bec2301cd" (UID: "b8e0a645-f5dd-4e8f-a166-310bec2301cd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.629850 4740 scope.go:117] "RemoveContainer" containerID="3b029f5ff7dd3df43796804a16b4a895b0070043a317dddcfa20e34cd7d883e2" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.631754 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.649044 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8e0a645-f5dd-4e8f-a166-310bec2301cd-logs\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.649083 4740 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfe7c040-25ec-49bc-96a4-127e4281ae18-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.649099 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfe7c040-25ec-49bc-96a4-127e4281ae18-logs\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.652237 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8e0a645-f5dd-4e8f-a166-310bec2301cd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b8e0a645-f5dd-4e8f-a166-310bec2301cd" (UID: "b8e0a645-f5dd-4e8f-a166-310bec2301cd"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.656657 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-scripts" (OuterVolumeSpecName: "scripts") pod "cfe7c040-25ec-49bc-96a4-127e4281ae18" (UID: "cfe7c040-25ec-49bc-96a4-127e4281ae18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.656738 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8e0a645-f5dd-4e8f-a166-310bec2301cd-kube-api-access-sn2s8" (OuterVolumeSpecName: "kube-api-access-sn2s8") pod "b8e0a645-f5dd-4e8f-a166-310bec2301cd" (UID: "b8e0a645-f5dd-4e8f-a166-310bec2301cd"). InnerVolumeSpecName "kube-api-access-sn2s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.662991 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf33de39-642c-4ea4-8b3b-e505d43016f8-kube-api-access-tv8hd" (OuterVolumeSpecName: "kube-api-access-tv8hd") pod "cf33de39-642c-4ea4-8b3b-e505d43016f8" (UID: "cf33de39-642c-4ea4-8b3b-e505d43016f8"). InnerVolumeSpecName "kube-api-access-tv8hd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.665370 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:16 crc kubenswrapper[4740]: E0105 14:10:16.667111 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf33de39-642c-4ea4-8b3b-e505d43016f8" containerName="init" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.667129 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf33de39-642c-4ea4-8b3b-e505d43016f8" containerName="init" Jan 05 14:10:16 crc kubenswrapper[4740]: E0105 14:10:16.667144 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e0a645-f5dd-4e8f-a166-310bec2301cd" containerName="cinder-api-log" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.667153 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e0a645-f5dd-4e8f-a166-310bec2301cd" containerName="cinder-api-log" Jan 05 14:10:16 crc kubenswrapper[4740]: E0105 14:10:16.667171 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="ceilometer-central-agent" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.667178 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="ceilometer-central-agent" Jan 05 14:10:16 crc kubenswrapper[4740]: E0105 14:10:16.667195 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf33de39-642c-4ea4-8b3b-e505d43016f8" containerName="dnsmasq-dns" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.667218 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf33de39-642c-4ea4-8b3b-e505d43016f8" containerName="dnsmasq-dns" Jan 05 14:10:16 crc kubenswrapper[4740]: E0105 14:10:16.667233 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="sg-core" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.667238 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="sg-core" Jan 05 14:10:16 crc kubenswrapper[4740]: E0105 
14:10:16.667257 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="ceilometer-notification-agent" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.667263 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="ceilometer-notification-agent" Jan 05 14:10:16 crc kubenswrapper[4740]: E0105 14:10:16.667279 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe7c040-25ec-49bc-96a4-127e4281ae18" containerName="glance-httpd" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.667284 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe7c040-25ec-49bc-96a4-127e4281ae18" containerName="glance-httpd" Jan 05 14:10:16 crc kubenswrapper[4740]: E0105 14:10:16.667291 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe7c040-25ec-49bc-96a4-127e4281ae18" containerName="glance-log" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.667296 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe7c040-25ec-49bc-96a4-127e4281ae18" containerName="glance-log" Jan 05 14:10:16 crc kubenswrapper[4740]: E0105 14:10:16.667305 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="proxy-httpd" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.667310 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="proxy-httpd" Jan 05 14:10:16 crc kubenswrapper[4740]: E0105 14:10:16.667357 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e0a645-f5dd-4e8f-a166-310bec2301cd" containerName="cinder-api" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.667363 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e0a645-f5dd-4e8f-a166-310bec2301cd" containerName="cinder-api" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.673339 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="ceilometer-central-agent" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.673993 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="proxy-httpd" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.674025 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8e0a645-f5dd-4e8f-a166-310bec2301cd" containerName="cinder-api-log" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.674047 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="sg-core" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.674149 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe7c040-25ec-49bc-96a4-127e4281ae18" containerName="glance-log" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.674176 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8e0a645-f5dd-4e8f-a166-310bec2301cd" containerName="cinder-api" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.674191 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe7c040-25ec-49bc-96a4-127e4281ae18" containerName="glance-httpd" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.674205 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf33de39-642c-4ea4-8b3b-e505d43016f8" containerName="dnsmasq-dns" Jan 05 14:10:16 crc 
kubenswrapper[4740]: I0105 14:10:16.674238 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" containerName="ceilometer-notification-agent" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.680629 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.687377 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.687625 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.692825 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b8e0a645-f5dd-4e8f-a166-310bec2301cd" (UID: "b8e0a645-f5dd-4e8f-a166-310bec2301cd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.701493 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe7c040-25ec-49bc-96a4-127e4281ae18-kube-api-access-cbtwx" (OuterVolumeSpecName: "kube-api-access-cbtwx") pod "cfe7c040-25ec-49bc-96a4-127e4281ae18" (UID: "cfe7c040-25ec-49bc-96a4-127e4281ae18"). InnerVolumeSpecName "kube-api-access-cbtwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.712084 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-scripts" (OuterVolumeSpecName: "scripts") pod "b8e0a645-f5dd-4e8f-a166-310bec2301cd" (UID: "b8e0a645-f5dd-4e8f-a166-310bec2301cd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.732118 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.753036 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.753166 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.753268 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9dp4\" (UniqueName: \"kubernetes.io/projected/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-kube-api-access-s9dp4\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.753298 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-log-httpd\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.753351 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-run-httpd\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.753408 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-scripts\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.753429 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-config-data\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.753537 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv8hd\" (UniqueName: \"kubernetes.io/projected/cf33de39-642c-4ea4-8b3b-e505d43016f8-kube-api-access-tv8hd\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.753555 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbtwx\" (UniqueName: \"kubernetes.io/projected/cfe7c040-25ec-49bc-96a4-127e4281ae18-kube-api-access-cbtwx\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.753569 4740 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/b8e0a645-f5dd-4e8f-a166-310bec2301cd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.753580 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.753591 4740 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.753602 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.753614 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn2s8\" (UniqueName: \"kubernetes.io/projected/b8e0a645-f5dd-4e8f-a166-310bec2301cd-kube-api-access-sn2s8\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.830438 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57" (OuterVolumeSpecName: "glance") pod "cfe7c040-25ec-49bc-96a4-127e4281ae18" (UID: "cfe7c040-25ec-49bc-96a4-127e4281ae18"). InnerVolumeSpecName "pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.856013 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9dp4\" (UniqueName: \"kubernetes.io/projected/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-kube-api-access-s9dp4\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.856078 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-log-httpd\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.856120 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-run-httpd\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.856168 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-scripts\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.856183 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-config-data\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.856258 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.856303 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.856388 4740 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") on node \"crc\" " Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.860797 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-log-httpd\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.861095 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-run-httpd\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.891349 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.896213 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9dp4\" (UniqueName: \"kubernetes.io/projected/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-kube-api-access-s9dp4\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.896571 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.898006 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-scripts\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.901780 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-config-data\") pod \"ceilometer-0\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " pod="openstack/ceilometer-0" Jan 05 14:10:16 crc kubenswrapper[4740]: I0105 14:10:16.992054 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="953b81f2-4ef4-4837-ae64-fed14572e2e6" 
path="/var/lib/kubelet/pods/953b81f2-4ef4-4837-ae64-fed14572e2e6/volumes" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.224962 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf33de39-642c-4ea4-8b3b-e505d43016f8" (UID: "cf33de39-642c-4ea4-8b3b-e505d43016f8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:10:17 crc kubenswrapper[4740]: W0105 14:10:17.232937 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42341a11_18a3_4ef1_8e12_a9450f68370e.slice/crio-e53b5163e6783a5ee921103e1e5209438505b378f377f822cbd66c917f18352c WatchSource:0}: Error finding container e53b5163e6783a5ee921103e1e5209438505b378f377f822cbd66c917f18352c: Status 404 returned error can't find the container with id e53b5163e6783a5ee921103e1e5209438505b378f377f822cbd66c917f18352c Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.258212 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8e0a645-f5dd-4e8f-a166-310bec2301cd" (UID: "b8e0a645-f5dd-4e8f-a166-310bec2301cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.278433 4740 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.278590 4740 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57") on node "crc" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.284591 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 05 14:10:17 crc kubenswrapper[4740]: W0105 14:10:17.284888 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9307c633_b44c_4f4d_8414_9f28826c30dc.slice/crio-74f26ad623f47a8a4ec6e4de16fe4ab04b71dfc46b57b0a134c5c29d5055c1c2 WatchSource:0}: Error finding container 74f26ad623f47a8a4ec6e4de16fe4ab04b71dfc46b57b0a134c5c29d5055c1c2: Status 404 returned error can't find the container with id 74f26ad623f47a8a4ec6e4de16fe4ab04b71dfc46b57b0a134c5c29d5055c1c2 Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.299669 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.301597 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-config-data" (OuterVolumeSpecName: "config-data") pod "b8e0a645-f5dd-4e8f-a166-310bec2301cd" (UID: "b8e0a645-f5dd-4e8f-a166-310bec2301cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.312900 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf33de39-642c-4ea4-8b3b-e505d43016f8" (UID: "cf33de39-642c-4ea4-8b3b-e505d43016f8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.320470 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cfe7c040-25ec-49bc-96a4-127e4281ae18" (UID: "cfe7c040-25ec-49bc-96a4-127e4281ae18"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.320546 4740 reconciler_common.go:293] "Volume detached for volume \"pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.320568 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.320585 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.330676 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cf33de39-642c-4ea4-8b3b-e505d43016f8" (UID: "cf33de39-642c-4ea4-8b3b-e505d43016f8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.349007 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf33de39-642c-4ea4-8b3b-e505d43016f8" (UID: "cf33de39-642c-4ea4-8b3b-e505d43016f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.364782 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-config" (OuterVolumeSpecName: "config") pod "cf33de39-642c-4ea4-8b3b-e505d43016f8" (UID: "cf33de39-642c-4ea4-8b3b-e505d43016f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.365543 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfe7c040-25ec-49bc-96a4-127e4281ae18" (UID: "cfe7c040-25ec-49bc-96a4-127e4281ae18"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.392748 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-config-data" (OuterVolumeSpecName: "config-data") pod "cfe7c040-25ec-49bc-96a4-127e4281ae18" (UID: "cfe7c040-25ec-49bc-96a4-127e4281ae18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.427723 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.428050 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.438187 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8e0a645-f5dd-4e8f-a166-310bec2301cd-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.438223 4740 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.438238 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.438248 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe7c040-25ec-49bc-96a4-127e4281ae18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.438257 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.438268 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf33de39-642c-4ea4-8b3b-e505d43016f8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.453966 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4928-account-create-update-h89c7"] Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.453998 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-d7697fc9-j6p2f"] Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.454011 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6419-account-create-update-fgr2l"] Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.454023 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-75bf4fb87-2g4f8"] Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.481587 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.484187 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7d5f88dcf-flfmp"] Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.488354 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.517690 4740 scope.go:117] "RemoveContainer" containerID="82e8e6b99c9f49bdc55592fe4b03219dc6cc593a93cc72d1ab2c5aa24c9cd0c6" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.540247 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-scripts\") pod \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.540289 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whnpg\" (UniqueName: \"kubernetes.io/projected/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-kube-api-access-whnpg\") pod \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.540360 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-config-data\") pod \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.540414 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-combined-ca-bundle\") pod \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.540478 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-public-tls-certs\") pod \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.540495 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-logs\") pod \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.541054 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") pod \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.541135 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-httpd-run\") pod \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\" (UID: \"ecdaebe5-8250-4c27-a69a-3b1ebd335a48\") " Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.551850 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ecdaebe5-8250-4c27-a69a-3b1ebd335a48" (UID: "ecdaebe5-8250-4c27-a69a-3b1ebd335a48"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.554120 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-logs" (OuterVolumeSpecName: "logs") pod "ecdaebe5-8250-4c27-a69a-3b1ebd335a48" (UID: "ecdaebe5-8250-4c27-a69a-3b1ebd335a48"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.564541 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d9fb66bc8-jrdhh" event={"ID":"a2ace04b-27fb-40e6-9cee-6ab7d742cff7","Type":"ContainerStarted","Data":"75ed23573953c0c6bdd2f2884fa5a323a3699a6921bd1ca83e8cf4e76851b2c1"} Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.564692 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-d9fb66bc8-jrdhh" podUID="a2ace04b-27fb-40e6-9cee-6ab7d742cff7" containerName="heat-api" containerID="cri-o://75ed23573953c0c6bdd2f2884fa5a323a3699a6921bd1ca83e8cf4e76851b2c1" gracePeriod=60 Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.565188 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-d9fb66bc8-jrdhh" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.600630 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6605f3c2-8a9f-45d4-9060-8e44e4bccac6","Type":"ContainerStarted","Data":"c8c6fe74f2455ffca9f8c4875922749d18f5e2d56138498668491781647d0af8"} Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.619506 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-scripts" (OuterVolumeSpecName: "scripts") pod "ecdaebe5-8250-4c27-a69a-3b1ebd335a48" (UID: "ecdaebe5-8250-4c27-a69a-3b1ebd335a48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.626749 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-kube-api-access-whnpg" (OuterVolumeSpecName: "kube-api-access-whnpg") pod "ecdaebe5-8250-4c27-a69a-3b1ebd335a48" (UID: "ecdaebe5-8250-4c27-a69a-3b1ebd335a48"). InnerVolumeSpecName "kube-api-access-whnpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.628046 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-d9fb66bc8-jrdhh" podStartSLOduration=4.36163572 podStartE2EDuration="16.62802808s" podCreationTimestamp="2026-01-05 14:10:01 +0000 UTC" firstStartedPulling="2026-01-05 14:10:02.960849793 +0000 UTC m=+1252.267758372" lastFinishedPulling="2026-01-05 14:10:15.227242153 +0000 UTC m=+1264.534150732" observedRunningTime="2026-01-05 14:10:17.62469673 +0000 UTC m=+1266.931605309" watchObservedRunningTime="2026-01-05 14:10:17.62802808 +0000 UTC m=+1266.934936659" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.628373 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6419-account-create-update-fgr2l" event={"ID":"e21239f6-f726-4394-b6fe-f7f7f438d7b5","Type":"ContainerStarted","Data":"670b7bf9a96529efb1ee2d8405b4f182e0a328c9564620546480eebf2f17b0f3"} Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.647229 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.647265 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whnpg\" (UniqueName: \"kubernetes.io/projected/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-kube-api-access-whnpg\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.647275 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-logs\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.647284 4740 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.657307 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4928-account-create-update-h89c7" event={"ID":"a6785a14-5ddd-471f-ac7d-eed48bfacd44","Type":"ContainerStarted","Data":"f27e37cb419074e8525d893406715a9209485faa8f77626f7b27bbb950dc2516"} Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.663249 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9hbx9"] Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.691216 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pgxvc"] Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.699892 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" event={"ID":"9307c633-b44c-4f4d-8414-9f28826c30dc","Type":"ContainerStarted","Data":"74f26ad623f47a8a4ec6e4de16fe4ab04b71dfc46b57b0a134c5c29d5055c1c2"} Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.752975 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.100795308 podStartE2EDuration="22.752956933s" podCreationTimestamp="2026-01-05 14:09:55 +0000 UTC" firstStartedPulling="2026-01-05 14:09:56.702626182 +0000 UTC m=+1246.009534761" lastFinishedPulling="2026-01-05 14:10:15.354787807 +0000 UTC m=+1264.661696386" observedRunningTime="2026-01-05 14:10:17.663227694 +0000 UTC m=+1266.970136273" 
watchObservedRunningTime="2026-01-05 14:10:17.752956933 +0000 UTC m=+1267.059865512" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.776000 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ecdaebe5-8250-4c27-a69a-3b1ebd335a48","Type":"ContainerDied","Data":"66bbf1604c30e7418cda7085cbf1cfedb8a4aec4860ef67fd075977465896b00"} Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.776057 4740 scope.go:117] "RemoveContainer" containerID="73c75d4459e537b669822a1c2b1e48e92d76fb08eb1b0b1fe00322a5b4afb4a6" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.776231 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.798868 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-z6tgb"] Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.812158 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba" (OuterVolumeSpecName: "glance") pod "ecdaebe5-8250-4c27-a69a-3b1ebd335a48" (UID: "ecdaebe5-8250-4c27-a69a-3b1ebd335a48"). InnerVolumeSpecName "pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.837130 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7d5f88dcf-flfmp" event={"ID":"981fd275-b6a0-4023-b7a0-17427043fab4","Type":"ContainerStarted","Data":"eb147c8480e6be644da563587fc08e4f8565c04007dfb118c74cbe7f97ba4e3a"} Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.847451 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-599d7559df-v72j9"] Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.847485 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c8776b56d-szfht" event={"ID":"9c500176-40a9-42bd-83f6-1dc0df20484b","Type":"ContainerStarted","Data":"b8fb83913b2a96a66327269f7aea014b20db1841f90afc12ae327e8d5ac758d3"} Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.847485 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-6c8776b56d-szfht" podUID="9c500176-40a9-42bd-83f6-1dc0df20484b" containerName="heat-cfnapi" containerID="cri-o://b8fb83913b2a96a66327269f7aea014b20db1841f90afc12ae327e8d5ac758d3" gracePeriod=60 Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.847930 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6c8776b56d-szfht" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.851448 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.854380 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-8qhgf" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.854992 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-d7697fc9-j6p2f" event={"ID":"42341a11-18a3-4ef1-8e12-a9450f68370e","Type":"ContainerStarted","Data":"e53b5163e6783a5ee921103e1e5209438505b378f377f822cbd66c917f18352c"} Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.855109 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.871022 4740 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") on node \"crc\" " Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.887219 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5879f5d6d6-rz6db"] Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.899156 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-8669649c9d-h45rr"] Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.909638 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5490-account-create-update-mq5tj"] Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.910823 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6c8776b56d-szfht" podStartSLOduration=5.037062115 podStartE2EDuration="16.910805962s" podCreationTimestamp="2026-01-05 14:10:01 +0000 UTC" firstStartedPulling="2026-01-05 14:10:03.181701193 +0000 UTC m=+1252.488609782" lastFinishedPulling="2026-01-05 14:10:15.05544505 +0000 UTC m=+1264.362353629" observedRunningTime="2026-01-05 14:10:17.868817205 +0000 UTC m=+1267.175725794" watchObservedRunningTime="2026-01-05 14:10:17.910805962 +0000 UTC m=+1267.217714541" Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.940984 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.957200 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 05 14:10:17 crc kubenswrapper[4740]: I0105 14:10:17.993360 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-8qhgf"] Jan 05 14:10:18 crc kubenswrapper[4740]: I0105 14:10:18.011813 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ecdaebe5-8250-4c27-a69a-3b1ebd335a48" (UID: "ecdaebe5-8250-4c27-a69a-3b1ebd335a48"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:18 crc kubenswrapper[4740]: I0105 14:10:18.014810 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-8qhgf"] Jan 05 14:10:18 crc kubenswrapper[4740]: I0105 14:10:18.022477 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecdaebe5-8250-4c27-a69a-3b1ebd335a48" (UID: "ecdaebe5-8250-4c27-a69a-3b1ebd335a48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:18 crc kubenswrapper[4740]: I0105 14:10:18.024272 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-config-data" (OuterVolumeSpecName: "config-data") pod "ecdaebe5-8250-4c27-a69a-3b1ebd335a48" (UID: "ecdaebe5-8250-4c27-a69a-3b1ebd335a48"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:18 crc kubenswrapper[4740]: I0105 14:10:18.029308 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 05 14:10:18 crc kubenswrapper[4740]: E0105 14:10:18.029776 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecdaebe5-8250-4c27-a69a-3b1ebd335a48" containerName="glance-httpd" Jan 05 14:10:18 crc kubenswrapper[4740]: I0105 14:10:18.029792 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecdaebe5-8250-4c27-a69a-3b1ebd335a48" containerName="glance-httpd" Jan 05 14:10:18 crc kubenswrapper[4740]: E0105 14:10:18.029837 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecdaebe5-8250-4c27-a69a-3b1ebd335a48" containerName="glance-log" Jan 05 14:10:18 crc kubenswrapper[4740]: I0105 14:10:18.029844 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecdaebe5-8250-4c27-a69a-3b1ebd335a48" containerName="glance-log" Jan 05 14:10:18 crc kubenswrapper[4740]: I0105 14:10:18.030045 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecdaebe5-8250-4c27-a69a-3b1ebd335a48" containerName="glance-httpd" Jan 05 14:10:18 crc kubenswrapper[4740]: I0105 14:10:18.030091 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecdaebe5-8250-4c27-a69a-3b1ebd335a48" containerName="glance-log" Jan 05 14:10:18 crc kubenswrapper[4740]: I0105 14:10:18.031276 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 05 14:10:18 crc kubenswrapper[4740]: I0105 14:10:18.037638 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 05 14:10:18 crc kubenswrapper[4740]: I0105 14:10:18.037892 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 05 14:10:18 crc kubenswrapper[4740]: I0105 14:10:18.038157 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 05 14:10:18 crc kubenswrapper[4740]: I0105 14:10:18.048332 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 05 14:10:18 crc kubenswrapper[4740]: I0105 14:10:18.063237 4740 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 05 14:10:18 crc kubenswrapper[4740]: I0105 14:10:18.063385 4740 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba") on node "crc" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.080170 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.098808 4740 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.098833 4740 reconciler_common.go:293] "Volume detached for volume \"pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.098843 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.098853 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdaebe5-8250-4c27-a69a-3b1ebd335a48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.114017 4740 scope.go:117] "RemoveContainer" containerID="9d257f1cb533227dae1cbe31e1a03ab8d97935b72b69fb285052eddf1cb73c4b" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.132586 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.132617 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.146704 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.148724 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.152133 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.152270 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.152286 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.152392 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kw6zt" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.207777 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6206351-05ef-4835-9e76-0002c5eca516-logs\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.207952 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.207999 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-scripts\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.208161 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-config-data\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.208252 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.208297 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.208427 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6206351-05ef-4835-9e76-0002c5eca516-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.208573 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-config-data-custom\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.208611 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czr8h\" (UniqueName: \"kubernetes.io/projected/b6206351-05ef-4835-9e76-0002c5eca516-kube-api-access-czr8h\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.211776 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.270000 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.310473 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pvlm\" (UniqueName: \"kubernetes.io/projected/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-kube-api-access-6pvlm\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.310597 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6206351-05ef-4835-9e76-0002c5eca516-logs\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.310624 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.310684 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.310702 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.310721 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-scripts\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.310735 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.310766 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-config-data\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.310801 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.310834 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.310868 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.310887 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.314195 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6206351-05ef-4835-9e76-0002c5eca516-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.314579 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.314627 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.314703 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-config-data-custom\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.314739 
4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czr8h\" (UniqueName: \"kubernetes.io/projected/b6206351-05ef-4835-9e76-0002c5eca516-kube-api-access-czr8h\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.321441 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6206351-05ef-4835-9e76-0002c5eca516-logs\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.323788 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6206351-05ef-4835-9e76-0002c5eca516-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.329724 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-scripts\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.365337 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.365514 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.367264 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-config-data-custom\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.385457 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-config-data\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.410700 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6206351-05ef-4835-9e76-0002c5eca516-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.411191 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czr8h\" (UniqueName: \"kubernetes.io/projected/b6206351-05ef-4835-9e76-0002c5eca516-kube-api-access-czr8h\") pod \"cinder-api-0\" (UID: \"b6206351-05ef-4835-9e76-0002c5eca516\") " pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.442308 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.442363 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.442544 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.442576 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.442640 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pvlm\" (UniqueName: \"kubernetes.io/projected/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-kube-api-access-6pvlm\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.442742 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.442807 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.442826 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-logs\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.443370 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-logs\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.456818 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.457889 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.458277 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.458297 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f94b1d52da0ffd2f3c454948afdab6168cc98b014f61ee88f5256307d60f00b0/globalmount\"" pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.460770 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.488194 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.488254 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.508921 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pvlm\" (UniqueName: \"kubernetes.io/projected/b4741b20-74ae-4c1d-b8ea-f9d0579f0b20-kube-api-access-6pvlm\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.650486 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f1d2332-2b77-46f5-afc1-6bfd207b5c57\") pod \"glance-default-internal-api-0\" (UID: \"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20\") " pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.654692 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.813976 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.822124 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.839366 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.870711 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.872898 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.876448 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.876629 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.903313 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.947027 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-d7697fc9-j6p2f" event={"ID":"42341a11-18a3-4ef1-8e12-a9450f68370e","Type":"ContainerStarted","Data":"0e9314facab50354dadf8e2c084c481422c6ac5e4073d436b095728d6d4b9374"} Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.948462 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.961098 4740 generic.go:334] "Generic (PLEG): container finished" podID="a2ace04b-27fb-40e6-9cee-6ab7d742cff7" containerID="75ed23573953c0c6bdd2f2884fa5a323a3699a6921bd1ca83e8cf4e76851b2c1" exitCode=0 Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.961179 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d9fb66bc8-jrdhh" event={"ID":"a2ace04b-27fb-40e6-9cee-6ab7d742cff7","Type":"ContainerDied","Data":"75ed23573953c0c6bdd2f2884fa5a323a3699a6921bd1ca83e8cf4e76851b2c1"} Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.961796 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.961894 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpggt\" (UniqueName: \"kubernetes.io/projected/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-kube-api-access-gpggt\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.961917 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-logs\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.961961 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.961983 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.962018 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.962075 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-scripts\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:18.962137 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-config-data\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.003773 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-d7697fc9-j6p2f" podStartSLOduration=10.003751575 podStartE2EDuration="10.003751575s" podCreationTimestamp="2026-01-05 14:10:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:10:18.973818001 +0000 UTC m=+1268.280726570" watchObservedRunningTime="2026-01-05 14:10:19.003751575 +0000 UTC m=+1268.310660174" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.013762 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8e0a645-f5dd-4e8f-a166-310bec2301cd" path="/var/lib/kubelet/pods/b8e0a645-f5dd-4e8f-a166-310bec2301cd/volumes" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.014696 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf33de39-642c-4ea4-8b3b-e505d43016f8" path="/var/lib/kubelet/pods/cf33de39-642c-4ea4-8b3b-e505d43016f8/volumes" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.019887 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfe7c040-25ec-49bc-96a4-127e4281ae18" path="/var/lib/kubelet/pods/cfe7c040-25ec-49bc-96a4-127e4281ae18/volumes" Jan 05 
14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.020973 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecdaebe5-8250-4c27-a69a-3b1ebd335a48" path="/var/lib/kubelet/pods/ecdaebe5-8250-4c27-a69a-3b1ebd335a48/volumes" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.027341 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9hbx9" event={"ID":"cadffc7b-e58f-4f7e-b312-d477c2ad9429","Type":"ContainerStarted","Data":"f86ec2eacc55cd980b95413035926bf3444ec7babc34d7e19e3ab944ca3a3b19"} Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.036300 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ea1ba5-42ec-47b9-b385-f68f88ec89a9","Type":"ContainerStarted","Data":"80df1cf6fafbbe4dc0ec51bd013b4df38a7b7011f9679813bd212171f4c949b4"} Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.042163 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5490-account-create-update-mq5tj" event={"ID":"0b8e97e5-739e-4c24-9768-e6a1fb56bace","Type":"ContainerStarted","Data":"563298a32eaa9303942e288c2f15a5f6b1f89190f99954aca44553be808a3ace"} Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.050156 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-8669649c9d-h45rr" event={"ID":"116ff769-c12d-42af-93a9-549001762acd","Type":"ContainerStarted","Data":"947af3699a9aaa62c31e1417fbd983e9c7c29c1da4aa8e8da89101bd7d06e3ec"} Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.059640 4740 generic.go:334] "Generic (PLEG): container finished" podID="a6785a14-5ddd-471f-ac7d-eed48bfacd44" containerID="5086cd772cc74834eb4e8205b880c36e422e72de4ffb653d932b87b19b168239" exitCode=0 Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.059703 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4928-account-create-update-h89c7" event={"ID":"a6785a14-5ddd-471f-ac7d-eed48bfacd44","Type":"ContainerDied","Data":"5086cd772cc74834eb4e8205b880c36e422e72de4ffb653d932b87b19b168239"} Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.064840 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.064921 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpggt\" (UniqueName: \"kubernetes.io/projected/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-kube-api-access-gpggt\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.064941 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-logs\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.064998 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.065019 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.065058 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.065116 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-scripts\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.065177 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-config-data\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.072853 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-logs\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.074544 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.075743 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" event={"ID":"49f587f8-8537-4ed2-a04f-715a8ea4781c","Type":"ContainerStarted","Data":"433487d84e3f089c15db82dcfd26c3e2e3925e585e4f94e5013bf6d83d76500b"} Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.103933 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z6tgb" event={"ID":"386101d7-0cc9-471d-8fe8-95af3e74a121","Type":"ContainerStarted","Data":"d7b0f572967fa5a1e966c5ad03ef97a5710f1d724b99b2daf6573480d2b26102"} Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.104241 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.104279 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9f181c6c175c062630ff3c011c7f7b401c107aabb3ab99d5fa4b9467f0f8d5d6/globalmount\"" pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.106526 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pgxvc" event={"ID":"2ef0a3cf-3805-44a3-9c13-601726874901","Type":"ContainerStarted","Data":"502eb102d036b04695eee49ae09ecad5103180b9bc1e1f2e195c43e66670a9e4"} Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.108602 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpggt\" (UniqueName: \"kubernetes.io/projected/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-kube-api-access-gpggt\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.121972 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-599d7559df-v72j9" event={"ID":"49f975a3-3b3b-4dcf-9199-ec3acf657062","Type":"ContainerStarted","Data":"e07faf3d616aeb2305305b1c061139bb0523b27cd8252b9db0f867274310582f"} Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.137232 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6419-account-create-update-fgr2l" event={"ID":"e21239f6-f726-4394-b6fe-f7f7f438d7b5","Type":"ContainerStarted","Data":"ac86b8bb7b7f5ab0da899d8fc14c0d261246b98bf618c7a204bf60d3aebd5dd5"} Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.141597 4740 generic.go:334] "Generic (PLEG): container finished" podID="9c500176-40a9-42bd-83f6-1dc0df20484b" containerID="b8fb83913b2a96a66327269f7aea014b20db1841f90afc12ae327e8d5ac758d3" exitCode=0 Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.141666 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c8776b56d-szfht" event={"ID":"9c500176-40a9-42bd-83f6-1dc0df20484b","Type":"ContainerDied","Data":"b8fb83913b2a96a66327269f7aea014b20db1841f90afc12ae327e8d5ac758d3"} Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.146666 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.149128 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.155392 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.159223 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4ba45a8-a488-4b50-8d75-13a2e63dbac8-scripts\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.159729 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-6419-account-create-update-fgr2l" podStartSLOduration=9.159717562 podStartE2EDuration="9.159717562s" podCreationTimestamp="2026-01-05 14:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:10:19.15069562 +0000 UTC m=+1268.457604189" watchObservedRunningTime="2026-01-05 14:10:19.159717562 +0000 UTC m=+1268.466626141" Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.682925 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 05 14:10:19 crc kubenswrapper[4740]: I0105 14:10:19.779811 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.013633 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d700616a-0d2c-4431-9ee8-deadd93dbdba\") pod \"glance-default-external-api-0\" (UID: \"f4ba45a8-a488-4b50-8d75-13a2e63dbac8\") " pod="openstack/glance-default-external-api-0" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.121092 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.167104 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pgxvc" event={"ID":"2ef0a3cf-3805-44a3-9c13-601726874901","Type":"ContainerDied","Data":"0a28a0e3d6c7e4ed890c9f0ebacacae40f2aea3bd5e997a9c096805e6bd1464a"} Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.166676 4740 generic.go:334] "Generic (PLEG): container finished" podID="2ef0a3cf-3805-44a3-9c13-601726874901" containerID="0a28a0e3d6c7e4ed890c9f0ebacacae40f2aea3bd5e997a9c096805e6bd1464a" exitCode=0 Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.175195 4740 generic.go:334] "Generic (PLEG): container finished" podID="49f975a3-3b3b-4dcf-9199-ec3acf657062" containerID="b2fa4b4451d6a801afdc4dfa7c7a871ce9bd11ed2a5f885d7ec373519195cfad" exitCode=1 Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.175266 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-599d7559df-v72j9" event={"ID":"49f975a3-3b3b-4dcf-9199-ec3acf657062","Type":"ContainerDied","Data":"b2fa4b4451d6a801afdc4dfa7c7a871ce9bd11ed2a5f885d7ec373519195cfad"} Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.176013 4740 scope.go:117] "RemoveContainer" containerID="b2fa4b4451d6a801afdc4dfa7c7a871ce9bd11ed2a5f885d7ec373519195cfad" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.214882 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-8669649c9d-h45rr" event={"ID":"116ff769-c12d-42af-93a9-549001762acd","Type":"ContainerStarted","Data":"2fc4036255b3f20c57469e0a421b0befb18910d0ca48c27c57bdb0b94e427c6d"} Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.215688 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.289354 4740 generic.go:334] "Generic (PLEG): container finished" podID="cadffc7b-e58f-4f7e-b312-d477c2ad9429" containerID="a0299e56f0e20b1923f748e3cc6051eddfbc969c7a41b8c3b26e0b30dce6a0b1" exitCode=0 Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.289629 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9hbx9" event={"ID":"cadffc7b-e58f-4f7e-b312-d477c2ad9429","Type":"ContainerDied","Data":"a0299e56f0e20b1923f748e3cc6051eddfbc969c7a41b8c3b26e0b30dce6a0b1"} Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.293903 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b6206351-05ef-4835-9e76-0002c5eca516","Type":"ContainerStarted","Data":"66e396d5f4b8000f1496772f942713be487fdda92af868a5814b053024925883"} Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.314638 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c8776b56d-szfht" event={"ID":"9c500176-40a9-42bd-83f6-1dc0df20484b","Type":"ContainerDied","Data":"d3888f80012cf379466471fba100ebd63f873d82420d7d63f94a9ca3166520a1"} Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.314674 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3888f80012cf379466471fba100ebd63f873d82420d7d63f94a9ca3166520a1" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.328373 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" 
event={"ID":"9307c633-b44c-4f4d-8414-9f28826c30dc","Type":"ContainerStarted","Data":"302a005d7d2596d5fa6e342ea5a9360993a4093c179ca42a5fcdbb899ba2fee5"} Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.329173 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.357785 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5490-account-create-update-mq5tj" event={"ID":"0b8e97e5-739e-4c24-9768-e6a1fb56bace","Type":"ContainerStarted","Data":"c0afb2f6f3a91e38cbf0e574ec430518cd4fb41ef4c26143e727965aa21a7f8e"} Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.387557 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-8669649c9d-h45rr" podStartSLOduration=8.387534227 podStartE2EDuration="8.387534227s" podCreationTimestamp="2026-01-05 14:10:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:10:20.256491969 +0000 UTC m=+1269.563400558" watchObservedRunningTime="2026-01-05 14:10:20.387534227 +0000 UTC m=+1269.694442806" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.434226 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20","Type":"ContainerStarted","Data":"d2ab5d735ba85fe0d7d85493cc78f3c72477c985434c87f12125b3c927984a69"} Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.471571 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d9fb66bc8-jrdhh" event={"ID":"a2ace04b-27fb-40e6-9cee-6ab7d742cff7","Type":"ContainerDied","Data":"f8217e03f88b13e3b3bc11620eb69f43e2862cfe334d3e6ca1606faa6c446e14"} Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.471619 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8217e03f88b13e3b3bc11620eb69f43e2862cfe334d3e6ca1606faa6c446e14" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.484125 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7d5f88dcf-flfmp" event={"ID":"981fd275-b6a0-4023-b7a0-17427043fab4","Type":"ContainerStarted","Data":"61e9ee2f7c5418db7f46ec26723402f1ad1c2c5f536c36c608ecc511dba7b83f"} Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.484211 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" podStartSLOduration=8.484193562 podStartE2EDuration="8.484193562s" podCreationTimestamp="2026-01-05 14:10:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:10:20.355587789 +0000 UTC m=+1269.662496368" watchObservedRunningTime="2026-01-05 14:10:20.484193562 +0000 UTC m=+1269.791102141" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.489594 4740 generic.go:334] "Generic (PLEG): container finished" podID="e21239f6-f726-4394-b6fe-f7f7f438d7b5" containerID="ac86b8bb7b7f5ab0da899d8fc14c0d261246b98bf618c7a204bf60d3aebd5dd5" exitCode=0 Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.489671 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6419-account-create-update-fgr2l" event={"ID":"e21239f6-f726-4394-b6fe-f7f7f438d7b5","Type":"ContainerDied","Data":"ac86b8bb7b7f5ab0da899d8fc14c0d261246b98bf618c7a204bf60d3aebd5dd5"} Jan 05 14:10:20 crc 
kubenswrapper[4740]: I0105 14:10:20.505988 4740 generic.go:334] "Generic (PLEG): container finished" podID="49f587f8-8537-4ed2-a04f-715a8ea4781c" containerID="34ac07e67afcca527c3a2fb3082a84dfcab16672f80fd46f5d601aaa666cc949" exitCode=1 Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.507215 4740 scope.go:117] "RemoveContainer" containerID="34ac07e67afcca527c3a2fb3082a84dfcab16672f80fd46f5d601aaa666cc949" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.507593 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" event={"ID":"49f587f8-8537-4ed2-a04f-715a8ea4781c","Type":"ContainerDied","Data":"34ac07e67afcca527c3a2fb3082a84dfcab16672f80fd46f5d601aaa666cc949"} Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.551710 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-5490-account-create-update-mq5tj" podStartSLOduration=10.551689894 podStartE2EDuration="10.551689894s" podCreationTimestamp="2026-01-05 14:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:10:20.385081571 +0000 UTC m=+1269.691990160" watchObservedRunningTime="2026-01-05 14:10:20.551689894 +0000 UTC m=+1269.858598473" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.610885 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-d9fb66bc8-jrdhh" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.615747 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6c8776b56d-szfht" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.643973 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48j9l\" (UniqueName: \"kubernetes.io/projected/9c500176-40a9-42bd-83f6-1dc0df20484b-kube-api-access-48j9l\") pod \"9c500176-40a9-42bd-83f6-1dc0df20484b\" (UID: \"9c500176-40a9-42bd-83f6-1dc0df20484b\") " Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.644340 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-config-data\") pod \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\" (UID: \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\") " Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.644393 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-combined-ca-bundle\") pod \"9c500176-40a9-42bd-83f6-1dc0df20484b\" (UID: \"9c500176-40a9-42bd-83f6-1dc0df20484b\") " Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.644461 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr2mq\" (UniqueName: \"kubernetes.io/projected/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-kube-api-access-wr2mq\") pod \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\" (UID: \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\") " Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.644484 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-config-data\") pod \"9c500176-40a9-42bd-83f6-1dc0df20484b\" (UID: \"9c500176-40a9-42bd-83f6-1dc0df20484b\") " Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.644508 4740 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-config-data-custom\") pod \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\" (UID: \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\") " Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.644705 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-combined-ca-bundle\") pod \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\" (UID: \"a2ace04b-27fb-40e6-9cee-6ab7d742cff7\") " Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.644822 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-config-data-custom\") pod \"9c500176-40a9-42bd-83f6-1dc0df20484b\" (UID: \"9c500176-40a9-42bd-83f6-1dc0df20484b\") " Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.661407 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9c500176-40a9-42bd-83f6-1dc0df20484b" (UID: "9c500176-40a9-42bd-83f6-1dc0df20484b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.662451 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c500176-40a9-42bd-83f6-1dc0df20484b-kube-api-access-48j9l" (OuterVolumeSpecName: "kube-api-access-48j9l") pod "9c500176-40a9-42bd-83f6-1dc0df20484b" (UID: "9c500176-40a9-42bd-83f6-1dc0df20484b"). InnerVolumeSpecName "kube-api-access-48j9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.663324 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-kube-api-access-wr2mq" (OuterVolumeSpecName: "kube-api-access-wr2mq") pod "a2ace04b-27fb-40e6-9cee-6ab7d742cff7" (UID: "a2ace04b-27fb-40e6-9cee-6ab7d742cff7"). InnerVolumeSpecName "kube-api-access-wr2mq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.678831 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a2ace04b-27fb-40e6-9cee-6ab7d742cff7" (UID: "a2ace04b-27fb-40e6-9cee-6ab7d742cff7"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.756299 4740 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.756335 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48j9l\" (UniqueName: \"kubernetes.io/projected/9c500176-40a9-42bd-83f6-1dc0df20484b-kube-api-access-48j9l\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.756347 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr2mq\" (UniqueName: \"kubernetes.io/projected/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-kube-api-access-wr2mq\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.756356 4740 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:20 crc kubenswrapper[4740]: I0105 14:10:20.856282 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.142400 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2ace04b-27fb-40e6-9cee-6ab7d742cff7" (UID: "a2ace04b-27fb-40e6-9cee-6ab7d742cff7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.169890 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.185094 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c500176-40a9-42bd-83f6-1dc0df20484b" (UID: "9c500176-40a9-42bd-83f6-1dc0df20484b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.271829 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.320327 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-config-data" (OuterVolumeSpecName: "config-data") pod "a2ace04b-27fb-40e6-9cee-6ab7d742cff7" (UID: "a2ace04b-27fb-40e6-9cee-6ab7d742cff7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.351264 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-config-data" (OuterVolumeSpecName: "config-data") pod "9c500176-40a9-42bd-83f6-1dc0df20484b" (UID: "9c500176-40a9-42bd-83f6-1dc0df20484b"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.373966 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c500176-40a9-42bd-83f6-1dc0df20484b-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.374016 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2ace04b-27fb-40e6-9cee-6ab7d742cff7-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.549933 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4928-account-create-update-h89c7" event={"ID":"a6785a14-5ddd-471f-ac7d-eed48bfacd44","Type":"ContainerDied","Data":"f27e37cb419074e8525d893406715a9209485faa8f77626f7b27bbb950dc2516"} Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.550279 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f27e37cb419074e8525d893406715a9209485faa8f77626f7b27bbb950dc2516" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.563133 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ea1ba5-42ec-47b9-b385-f68f88ec89a9","Type":"ContainerStarted","Data":"88f62be89765f10912f337de8c8107db6100c8e8d0f0dae9766e21cb9fa6410b"} Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.564536 4740 generic.go:334] "Generic (PLEG): container finished" podID="0b8e97e5-739e-4c24-9768-e6a1fb56bace" containerID="c0afb2f6f3a91e38cbf0e574ec430518cd4fb41ef4c26143e727965aa21a7f8e" exitCode=0 Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.564595 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5490-account-create-update-mq5tj" event={"ID":"0b8e97e5-739e-4c24-9768-e6a1fb56bace","Type":"ContainerDied","Data":"c0afb2f6f3a91e38cbf0e574ec430518cd4fb41ef4c26143e727965aa21a7f8e"} Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.579301 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z6tgb" event={"ID":"386101d7-0cc9-471d-8fe8-95af3e74a121","Type":"ContainerStarted","Data":"c83059239fd95d166b17faaba84439fae07290c974d4b57ab0b59c49c0438980"} Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.595499 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-599d7559df-v72j9" event={"ID":"49f975a3-3b3b-4dcf-9199-ec3acf657062","Type":"ContainerStarted","Data":"756a0e2ca0c0323cc468c214fb467f66104f4b704d94c150379ff8f85da1a3bc"} Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.595759 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.611484 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f4ba45a8-a488-4b50-8d75-13a2e63dbac8","Type":"ContainerStarted","Data":"7e2003336ed2a682b77afb713fca9481c16bc627af1b89b96b55bf3bacfd9d1f"} Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.614426 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4928-account-create-update-h89c7" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.619616 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-z6tgb" podStartSLOduration=11.619593976 podStartE2EDuration="11.619593976s" podCreationTimestamp="2026-01-05 14:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:10:21.603976386 +0000 UTC m=+1270.910884965" watchObservedRunningTime="2026-01-05 14:10:21.619593976 +0000 UTC m=+1270.926502555" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.629371 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7d5f88dcf-flfmp" event={"ID":"981fd275-b6a0-4023-b7a0-17427043fab4","Type":"ContainerStarted","Data":"e47889fa3a188f7768b5b65742d3b0a1ddb6f92ac9ad7fc9f85d50bb41f2a40b"} Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.629432 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.630510 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6c8776b56d-szfht" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.630863 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-d9fb66bc8-jrdhh" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.633159 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.636322 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-599d7559df-v72j9" podStartSLOduration=12.636308265 podStartE2EDuration="12.636308265s" podCreationTimestamp="2026-01-05 14:10:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:10:21.62271606 +0000 UTC m=+1270.929624639" watchObservedRunningTime="2026-01-05 14:10:21.636308265 +0000 UTC m=+1270.943216844" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.681084 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6785a14-5ddd-471f-ac7d-eed48bfacd44-operator-scripts\") pod \"a6785a14-5ddd-471f-ac7d-eed48bfacd44\" (UID: \"a6785a14-5ddd-471f-ac7d-eed48bfacd44\") " Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.685097 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7d5f88dcf-flfmp" podStartSLOduration=16.685083174 podStartE2EDuration="16.685083174s" podCreationTimestamp="2026-01-05 14:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:10:21.682473494 +0000 UTC m=+1270.989382073" watchObservedRunningTime="2026-01-05 14:10:21.685083174 +0000 UTC m=+1270.991991753" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.688358 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6785a14-5ddd-471f-ac7d-eed48bfacd44-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6785a14-5ddd-471f-ac7d-eed48bfacd44" (UID: "a6785a14-5ddd-471f-ac7d-eed48bfacd44"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.688557 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wt5b\" (UniqueName: \"kubernetes.io/projected/a6785a14-5ddd-471f-ac7d-eed48bfacd44-kube-api-access-4wt5b\") pod \"a6785a14-5ddd-471f-ac7d-eed48bfacd44\" (UID: \"a6785a14-5ddd-471f-ac7d-eed48bfacd44\") " Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.689722 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6785a14-5ddd-471f-ac7d-eed48bfacd44-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.723254 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6785a14-5ddd-471f-ac7d-eed48bfacd44-kube-api-access-4wt5b" (OuterVolumeSpecName: "kube-api-access-4wt5b") pod "a6785a14-5ddd-471f-ac7d-eed48bfacd44" (UID: "a6785a14-5ddd-471f-ac7d-eed48bfacd44"). InnerVolumeSpecName "kube-api-access-4wt5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.794563 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wt5b\" (UniqueName: \"kubernetes.io/projected/a6785a14-5ddd-471f-ac7d-eed48bfacd44-kube-api-access-4wt5b\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:21 crc kubenswrapper[4740]: I0105 14:10:21.816996 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 14:10:22 crc kubenswrapper[4740]: I0105 14:10:22.056957 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6c8776b56d-szfht"] Jan 05 14:10:22 crc kubenswrapper[4740]: I0105 14:10:22.121802 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6c8776b56d-szfht"] Jan 05 14:10:22 crc kubenswrapper[4740]: I0105 14:10:22.149256 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-d9fb66bc8-jrdhh"] Jan 05 14:10:22 crc kubenswrapper[4740]: I0105 14:10:22.155165 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-d9fb66bc8-jrdhh"] Jan 05 14:10:23 crc kubenswrapper[4740]: I0105 14:10:23.593758 4740 generic.go:334] "Generic (PLEG): container finished" podID="386101d7-0cc9-471d-8fe8-95af3e74a121" containerID="c83059239fd95d166b17faaba84439fae07290c974d4b57ab0b59c49c0438980" exitCode=0 Jan 05 14:10:23 crc kubenswrapper[4740]: I0105 14:10:23.607128 4740 generic.go:334] "Generic (PLEG): container finished" podID="49f975a3-3b3b-4dcf-9199-ec3acf657062" containerID="756a0e2ca0c0323cc468c214fb467f66104f4b704d94c150379ff8f85da1a3bc" exitCode=1 Jan 05 14:10:23 crc kubenswrapper[4740]: I0105 14:10:23.607769 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4928-account-create-update-h89c7" Jan 05 14:10:23 crc kubenswrapper[4740]: I0105 14:10:23.765741 4740 scope.go:117] "RemoveContainer" containerID="756a0e2ca0c0323cc468c214fb467f66104f4b704d94c150379ff8f85da1a3bc" Jan 05 14:10:23 crc kubenswrapper[4740]: E0105 14:10:23.766105 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-599d7559df-v72j9_openstack(49f975a3-3b3b-4dcf-9199-ec3acf657062)\"" pod="openstack/heat-api-599d7559df-v72j9" podUID="49f975a3-3b3b-4dcf-9199-ec3acf657062" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.038285 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c500176-40a9-42bd-83f6-1dc0df20484b" path="/var/lib/kubelet/pods/9c500176-40a9-42bd-83f6-1dc0df20484b/volumes" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.039184 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2ace04b-27fb-40e6-9cee-6ab7d742cff7" path="/var/lib/kubelet/pods/a2ace04b-27fb-40e6-9cee-6ab7d742cff7/volumes" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.040604 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z6tgb" event={"ID":"386101d7-0cc9-471d-8fe8-95af3e74a121","Type":"ContainerDied","Data":"c83059239fd95d166b17faaba84439fae07290c974d4b57ab0b59c49c0438980"} Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.040632 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-599d7559df-v72j9" event={"ID":"49f975a3-3b3b-4dcf-9199-ec3acf657062","Type":"ContainerDied","Data":"756a0e2ca0c0323cc468c214fb467f66104f4b704d94c150379ff8f85da1a3bc"} Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.040664 4740 scope.go:117] "RemoveContainer" containerID="b2fa4b4451d6a801afdc4dfa7c7a871ce9bd11ed2a5f885d7ec373519195cfad" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.362208 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pgxvc" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.363500 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9hbx9" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.409470 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6419-account-create-update-fgr2l" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.419372 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5490-account-create-update-mq5tj" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.431102 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ef0a3cf-3805-44a3-9c13-601726874901-operator-scripts\") pod \"2ef0a3cf-3805-44a3-9c13-601726874901\" (UID: \"2ef0a3cf-3805-44a3-9c13-601726874901\") " Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.431182 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cadffc7b-e58f-4f7e-b312-d477c2ad9429-operator-scripts\") pod \"cadffc7b-e58f-4f7e-b312-d477c2ad9429\" (UID: \"cadffc7b-e58f-4f7e-b312-d477c2ad9429\") " Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.431215 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98s5g\" (UniqueName: \"kubernetes.io/projected/cadffc7b-e58f-4f7e-b312-d477c2ad9429-kube-api-access-98s5g\") pod \"cadffc7b-e58f-4f7e-b312-d477c2ad9429\" (UID: \"cadffc7b-e58f-4f7e-b312-d477c2ad9429\") " Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.431333 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb5h2\" (UniqueName: \"kubernetes.io/projected/2ef0a3cf-3805-44a3-9c13-601726874901-kube-api-access-wb5h2\") pod \"2ef0a3cf-3805-44a3-9c13-601726874901\" (UID: \"2ef0a3cf-3805-44a3-9c13-601726874901\") " Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.432488 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cadffc7b-e58f-4f7e-b312-d477c2ad9429-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cadffc7b-e58f-4f7e-b312-d477c2ad9429" (UID: "cadffc7b-e58f-4f7e-b312-d477c2ad9429"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.433200 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef0a3cf-3805-44a3-9c13-601726874901-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ef0a3cf-3805-44a3-9c13-601726874901" (UID: "2ef0a3cf-3805-44a3-9c13-601726874901"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.445593 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ef0a3cf-3805-44a3-9c13-601726874901-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.445619 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cadffc7b-e58f-4f7e-b312-d477c2ad9429-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.468976 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cadffc7b-e58f-4f7e-b312-d477c2ad9429-kube-api-access-98s5g" (OuterVolumeSpecName: "kube-api-access-98s5g") pod "cadffc7b-e58f-4f7e-b312-d477c2ad9429" (UID: "cadffc7b-e58f-4f7e-b312-d477c2ad9429"). InnerVolumeSpecName "kube-api-access-98s5g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.469083 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef0a3cf-3805-44a3-9c13-601726874901-kube-api-access-wb5h2" (OuterVolumeSpecName: "kube-api-access-wb5h2") pod "2ef0a3cf-3805-44a3-9c13-601726874901" (UID: "2ef0a3cf-3805-44a3-9c13-601726874901"). InnerVolumeSpecName "kube-api-access-wb5h2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.547043 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krtbd\" (UniqueName: \"kubernetes.io/projected/e21239f6-f726-4394-b6fe-f7f7f438d7b5-kube-api-access-krtbd\") pod \"e21239f6-f726-4394-b6fe-f7f7f438d7b5\" (UID: \"e21239f6-f726-4394-b6fe-f7f7f438d7b5\") " Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.547113 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csp68\" (UniqueName: \"kubernetes.io/projected/0b8e97e5-739e-4c24-9768-e6a1fb56bace-kube-api-access-csp68\") pod \"0b8e97e5-739e-4c24-9768-e6a1fb56bace\" (UID: \"0b8e97e5-739e-4c24-9768-e6a1fb56bace\") " Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.547163 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b8e97e5-739e-4c24-9768-e6a1fb56bace-operator-scripts\") pod \"0b8e97e5-739e-4c24-9768-e6a1fb56bace\" (UID: \"0b8e97e5-739e-4c24-9768-e6a1fb56bace\") " Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.547267 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e21239f6-f726-4394-b6fe-f7f7f438d7b5-operator-scripts\") pod \"e21239f6-f726-4394-b6fe-f7f7f438d7b5\" (UID: \"e21239f6-f726-4394-b6fe-f7f7f438d7b5\") " Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.548137 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b8e97e5-739e-4c24-9768-e6a1fb56bace-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b8e97e5-739e-4c24-9768-e6a1fb56bace" (UID: "0b8e97e5-739e-4c24-9768-e6a1fb56bace"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.548648 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e21239f6-f726-4394-b6fe-f7f7f438d7b5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e21239f6-f726-4394-b6fe-f7f7f438d7b5" (UID: "e21239f6-f726-4394-b6fe-f7f7f438d7b5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.549429 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b8e97e5-739e-4c24-9768-e6a1fb56bace-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.549447 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb5h2\" (UniqueName: \"kubernetes.io/projected/2ef0a3cf-3805-44a3-9c13-601726874901-kube-api-access-wb5h2\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.549477 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e21239f6-f726-4394-b6fe-f7f7f438d7b5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.549491 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98s5g\" (UniqueName: \"kubernetes.io/projected/cadffc7b-e58f-4f7e-b312-d477c2ad9429-kube-api-access-98s5g\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.554544 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e21239f6-f726-4394-b6fe-f7f7f438d7b5-kube-api-access-krtbd" (OuterVolumeSpecName: "kube-api-access-krtbd") pod "e21239f6-f726-4394-b6fe-f7f7f438d7b5" (UID: "e21239f6-f726-4394-b6fe-f7f7f438d7b5"). InnerVolumeSpecName "kube-api-access-krtbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.554645 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b8e97e5-739e-4c24-9768-e6a1fb56bace-kube-api-access-csp68" (OuterVolumeSpecName: "kube-api-access-csp68") pod "0b8e97e5-739e-4c24-9768-e6a1fb56bace" (UID: "0b8e97e5-739e-4c24-9768-e6a1fb56bace"). InnerVolumeSpecName "kube-api-access-csp68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.630030 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b6206351-05ef-4835-9e76-0002c5eca516","Type":"ContainerStarted","Data":"ee05900515f6c6feead21a0e20b073fbc416a35c7fd5d8fd99a068fbc7634984"} Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.634129 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ea1ba5-42ec-47b9-b385-f68f88ec89a9","Type":"ContainerStarted","Data":"0bc6df6cbffcdcdb01ead8c52f5c895db645a6bed91152a322f7f465094402a6"} Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.637196 4740 generic.go:334] "Generic (PLEG): container finished" podID="49f587f8-8537-4ed2-a04f-715a8ea4781c" containerID="5a4fa887144ac1172bd70e0f001c9c5644e25b476e5e599aba817e56fdeb98f3" exitCode=1 Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.637240 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" event={"ID":"49f587f8-8537-4ed2-a04f-715a8ea4781c","Type":"ContainerDied","Data":"5a4fa887144ac1172bd70e0f001c9c5644e25b476e5e599aba817e56fdeb98f3"} Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.637262 4740 scope.go:117] "RemoveContainer" containerID="34ac07e67afcca527c3a2fb3082a84dfcab16672f80fd46f5d601aaa666cc949" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.637919 4740 scope.go:117] "RemoveContainer" containerID="5a4fa887144ac1172bd70e0f001c9c5644e25b476e5e599aba817e56fdeb98f3" Jan 05 14:10:24 crc kubenswrapper[4740]: E0105 14:10:24.638276 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5879f5d6d6-rz6db_openstack(49f587f8-8537-4ed2-a04f-715a8ea4781c)\"" pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" podUID="49f587f8-8537-4ed2-a04f-715a8ea4781c" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.644962 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5490-account-create-update-mq5tj" event={"ID":"0b8e97e5-739e-4c24-9768-e6a1fb56bace","Type":"ContainerDied","Data":"563298a32eaa9303942e288c2f15a5f6b1f89190f99954aca44553be808a3ace"} Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.644997 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="563298a32eaa9303942e288c2f15a5f6b1f89190f99954aca44553be808a3ace" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.645106 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5490-account-create-update-mq5tj" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.652423 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krtbd\" (UniqueName: \"kubernetes.io/projected/e21239f6-f726-4394-b6fe-f7f7f438d7b5-kube-api-access-krtbd\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.652452 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csp68\" (UniqueName: \"kubernetes.io/projected/0b8e97e5-739e-4c24-9768-e6a1fb56bace-kube-api-access-csp68\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.656481 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20","Type":"ContainerStarted","Data":"27ad3339d3ed04362c63a2a37e40f3e9bbdb56cf746ef731c22fba93dd9a447e"} Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.659751 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pgxvc" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.659762 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pgxvc" event={"ID":"2ef0a3cf-3805-44a3-9c13-601726874901","Type":"ContainerDied","Data":"502eb102d036b04695eee49ae09ecad5103180b9bc1e1f2e195c43e66670a9e4"} Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.659804 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="502eb102d036b04695eee49ae09ecad5103180b9bc1e1f2e195c43e66670a9e4" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.663559 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9hbx9" event={"ID":"cadffc7b-e58f-4f7e-b312-d477c2ad9429","Type":"ContainerDied","Data":"f86ec2eacc55cd980b95413035926bf3444ec7babc34d7e19e3ab944ca3a3b19"} Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.663598 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f86ec2eacc55cd980b95413035926bf3444ec7babc34d7e19e3ab944ca3a3b19" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.663655 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9hbx9" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.695598 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6419-account-create-update-fgr2l" event={"ID":"e21239f6-f726-4394-b6fe-f7f7f438d7b5","Type":"ContainerDied","Data":"670b7bf9a96529efb1ee2d8405b4f182e0a328c9564620546480eebf2f17b0f3"} Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.695650 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="670b7bf9a96529efb1ee2d8405b4f182e0a328c9564620546480eebf2f17b0f3" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.695617 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6419-account-create-update-fgr2l" Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.758897 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f4ba45a8-a488-4b50-8d75-13a2e63dbac8","Type":"ContainerStarted","Data":"bb22e525ea6258a1beb9384c35bb6f023d22c132dda442d781e319d08e1592b6"} Jan 05 14:10:24 crc kubenswrapper[4740]: I0105 14:10:24.841788 4740 scope.go:117] "RemoveContainer" containerID="756a0e2ca0c0323cc468c214fb467f66104f4b704d94c150379ff8f85da1a3bc" Jan 05 14:10:24 crc kubenswrapper[4740]: E0105 14:10:24.842074 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-599d7559df-v72j9_openstack(49f975a3-3b3b-4dcf-9199-ec3acf657062)\"" pod="openstack/heat-api-599d7559df-v72j9" podUID="49f975a3-3b3b-4dcf-9199-ec3acf657062" Jan 05 14:10:25 crc kubenswrapper[4740]: I0105 14:10:25.374672 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:25 crc kubenswrapper[4740]: I0105 14:10:25.374697 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:25 crc kubenswrapper[4740]: I0105 14:10:25.374706 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:25 crc kubenswrapper[4740]: I0105 14:10:25.475810 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-z6tgb" Jan 05 14:10:25 crc kubenswrapper[4740]: I0105 14:10:25.585918 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/386101d7-0cc9-471d-8fe8-95af3e74a121-operator-scripts\") pod \"386101d7-0cc9-471d-8fe8-95af3e74a121\" (UID: \"386101d7-0cc9-471d-8fe8-95af3e74a121\") " Jan 05 14:10:25 crc kubenswrapper[4740]: I0105 14:10:25.586154 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnxmg\" (UniqueName: \"kubernetes.io/projected/386101d7-0cc9-471d-8fe8-95af3e74a121-kube-api-access-fnxmg\") pod \"386101d7-0cc9-471d-8fe8-95af3e74a121\" (UID: \"386101d7-0cc9-471d-8fe8-95af3e74a121\") " Jan 05 14:10:25 crc kubenswrapper[4740]: I0105 14:10:25.589454 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/386101d7-0cc9-471d-8fe8-95af3e74a121-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "386101d7-0cc9-471d-8fe8-95af3e74a121" (UID: "386101d7-0cc9-471d-8fe8-95af3e74a121"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:10:25 crc kubenswrapper[4740]: I0105 14:10:25.596278 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/386101d7-0cc9-471d-8fe8-95af3e74a121-kube-api-access-fnxmg" (OuterVolumeSpecName: "kube-api-access-fnxmg") pod "386101d7-0cc9-471d-8fe8-95af3e74a121" (UID: "386101d7-0cc9-471d-8fe8-95af3e74a121"). InnerVolumeSpecName "kube-api-access-fnxmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:25 crc kubenswrapper[4740]: I0105 14:10:25.689077 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnxmg\" (UniqueName: \"kubernetes.io/projected/386101d7-0cc9-471d-8fe8-95af3e74a121-kube-api-access-fnxmg\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:25 crc kubenswrapper[4740]: I0105 14:10:25.689334 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/386101d7-0cc9-471d-8fe8-95af3e74a121-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:25 crc kubenswrapper[4740]: I0105 14:10:25.770551 4740 scope.go:117] "RemoveContainer" containerID="5a4fa887144ac1172bd70e0f001c9c5644e25b476e5e599aba817e56fdeb98f3" Jan 05 14:10:25 crc kubenswrapper[4740]: E0105 14:10:25.770885 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5879f5d6d6-rz6db_openstack(49f587f8-8537-4ed2-a04f-715a8ea4781c)\"" pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" podUID="49f587f8-8537-4ed2-a04f-715a8ea4781c" Jan 05 14:10:25 crc kubenswrapper[4740]: I0105 14:10:25.774183 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-z6tgb" Jan 05 14:10:25 crc kubenswrapper[4740]: I0105 14:10:25.774237 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z6tgb" event={"ID":"386101d7-0cc9-471d-8fe8-95af3e74a121","Type":"ContainerDied","Data":"d7b0f572967fa5a1e966c5ad03ef97a5710f1d724b99b2daf6573480d2b26102"} Jan 05 14:10:25 crc kubenswrapper[4740]: I0105 14:10:25.774278 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7b0f572967fa5a1e966c5ad03ef97a5710f1d724b99b2daf6573480d2b26102" Jan 05 14:10:25 crc kubenswrapper[4740]: I0105 14:10:25.774655 4740 scope.go:117] "RemoveContainer" containerID="756a0e2ca0c0323cc468c214fb467f66104f4b704d94c150379ff8f85da1a3bc" Jan 05 14:10:25 crc kubenswrapper[4740]: E0105 14:10:25.774896 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-599d7559df-v72j9_openstack(49f975a3-3b3b-4dcf-9199-ec3acf657062)\"" pod="openstack/heat-api-599d7559df-v72j9" podUID="49f975a3-3b3b-4dcf-9199-ec3acf657062" Jan 05 14:10:25 crc kubenswrapper[4740]: I0105 14:10:25.803868 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:26 crc kubenswrapper[4740]: I0105 14:10:26.784835 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f4ba45a8-a488-4b50-8d75-13a2e63dbac8","Type":"ContainerStarted","Data":"e5291a7b80184f7b114ae3621c1afdc4b8b7c03133d08cbfd24f708dad546be1"} Jan 05 14:10:26 crc kubenswrapper[4740]: I0105 14:10:26.786714 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b6206351-05ef-4835-9e76-0002c5eca516","Type":"ContainerStarted","Data":"ecb2258eed0c25d91bf74937aecdbe3829f85e52fc8b6ad97b8274a7971bad2a"} Jan 05 14:10:26 crc kubenswrapper[4740]: I0105 14:10:26.786837 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 05 14:10:26 crc kubenswrapper[4740]: I0105 14:10:26.789866 4740 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ea1ba5-42ec-47b9-b385-f68f88ec89a9","Type":"ContainerStarted","Data":"de30b53d9d1039b8468cacce7ca6165ceae6a79462a9bd20a14882e707c48058"} Jan 05 14:10:26 crc kubenswrapper[4740]: I0105 14:10:26.791723 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b4741b20-74ae-4c1d-b8ea-f9d0579f0b20","Type":"ContainerStarted","Data":"c28263a2c699bed14e692d0c8b31b31f074a86507ee6bed1ff8bf64983b62e04"} Jan 05 14:10:26 crc kubenswrapper[4740]: I0105 14:10:26.792220 4740 scope.go:117] "RemoveContainer" containerID="5a4fa887144ac1172bd70e0f001c9c5644e25b476e5e599aba817e56fdeb98f3" Jan 05 14:10:26 crc kubenswrapper[4740]: E0105 14:10:26.792441 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5879f5d6d6-rz6db_openstack(49f587f8-8537-4ed2-a04f-715a8ea4781c)\"" pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" podUID="49f587f8-8537-4ed2-a04f-715a8ea4781c" Jan 05 14:10:26 crc kubenswrapper[4740]: I0105 14:10:26.812281 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.812260899 podStartE2EDuration="8.812260899s" podCreationTimestamp="2026-01-05 14:10:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:10:26.802840376 +0000 UTC m=+1276.109748955" watchObservedRunningTime="2026-01-05 14:10:26.812260899 +0000 UTC m=+1276.119169478" Jan 05 14:10:26 crc kubenswrapper[4740]: I0105 14:10:26.829531 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.829516932 podStartE2EDuration="9.829516932s" podCreationTimestamp="2026-01-05 14:10:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:10:26.823578372 +0000 UTC m=+1276.130486951" watchObservedRunningTime="2026-01-05 14:10:26.829516932 +0000 UTC m=+1276.136425511" Jan 05 14:10:26 crc kubenswrapper[4740]: I0105 14:10:26.851501 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=9.851478571 podStartE2EDuration="9.851478571s" podCreationTimestamp="2026-01-05 14:10:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:10:26.843772264 +0000 UTC m=+1276.150680843" watchObservedRunningTime="2026-01-05 14:10:26.851478571 +0000 UTC m=+1276.158387150" Jan 05 14:10:27 crc kubenswrapper[4740]: I0105 14:10:27.844141 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ea1ba5-42ec-47b9-b385-f68f88ec89a9","Type":"ContainerStarted","Data":"7c4088a7363f13f885a5b43951e875b947affe6879e208b95287cc46117c91cc"} Jan 05 14:10:27 crc kubenswrapper[4740]: I0105 14:10:27.844739 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 05 14:10:27 crc kubenswrapper[4740]: I0105 14:10:27.916455 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.104721536 podStartE2EDuration="11.916433733s" 
podCreationTimestamp="2026-01-05 14:10:16 +0000 UTC" firstStartedPulling="2026-01-05 14:10:18.285827331 +0000 UTC m=+1267.592735910" lastFinishedPulling="2026-01-05 14:10:27.097539528 +0000 UTC m=+1276.404448107" observedRunningTime="2026-01-05 14:10:27.876532392 +0000 UTC m=+1277.183440971" watchObservedRunningTime="2026-01-05 14:10:27.916433733 +0000 UTC m=+1277.223342312" Jan 05 14:10:28 crc kubenswrapper[4740]: I0105 14:10:28.750147 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:28 crc kubenswrapper[4740]: I0105 14:10:28.814726 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 05 14:10:28 crc kubenswrapper[4740]: I0105 14:10:28.814776 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 05 14:10:28 crc kubenswrapper[4740]: I0105 14:10:28.889477 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 05 14:10:28 crc kubenswrapper[4740]: I0105 14:10:28.933775 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 05 14:10:29 crc kubenswrapper[4740]: I0105 14:10:29.868012 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerName="proxy-httpd" containerID="cri-o://7c4088a7363f13f885a5b43951e875b947affe6879e208b95287cc46117c91cc" gracePeriod=30 Jan 05 14:10:29 crc kubenswrapper[4740]: I0105 14:10:29.868090 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerName="ceilometer-notification-agent" containerID="cri-o://0bc6df6cbffcdcdb01ead8c52f5c895db645a6bed91152a322f7f465094402a6" gracePeriod=30 Jan 05 14:10:29 crc kubenswrapper[4740]: I0105 14:10:29.868056 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerName="sg-core" containerID="cri-o://de30b53d9d1039b8468cacce7ca6165ceae6a79462a9bd20a14882e707c48058" gracePeriod=30 Jan 05 14:10:29 crc kubenswrapper[4740]: I0105 14:10:29.868158 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 05 14:10:29 crc kubenswrapper[4740]: I0105 14:10:29.868475 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 05 14:10:29 crc kubenswrapper[4740]: I0105 14:10:29.868165 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerName="ceilometer-central-agent" containerID="cri-o://88f62be89765f10912f337de8c8107db6100c8e8d0f0dae9766e21cb9fa6410b" gracePeriod=30 Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.014036 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.121296 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7b9bfc84c-zlzbm"] Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.121495 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-7b9bfc84c-zlzbm" 
podUID="3ac97e60-10b0-43fb-98d3-7feaa0a40fc1" containerName="heat-engine" containerID="cri-o://3ae2e0692092a754d462f90895015535a659a53214fa4c504b3775352136f7ee" gracePeriod=60 Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.123133 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.123157 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.155586 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.185668 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.185794 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.259823 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-599d7559df-v72j9"] Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.336581 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.397437 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5879f5d6d6-rz6db"] Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.765736 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lsnkl"] Jan 05 14:10:30 crc kubenswrapper[4740]: E0105 14:10:30.766461 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e21239f6-f726-4394-b6fe-f7f7f438d7b5" containerName="mariadb-account-create-update" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.766477 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e21239f6-f726-4394-b6fe-f7f7f438d7b5" containerName="mariadb-account-create-update" Jan 05 14:10:30 crc kubenswrapper[4740]: E0105 14:10:30.766497 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2ace04b-27fb-40e6-9cee-6ab7d742cff7" containerName="heat-api" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.766503 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ace04b-27fb-40e6-9cee-6ab7d742cff7" containerName="heat-api" Jan 05 14:10:30 crc kubenswrapper[4740]: E0105 14:10:30.766512 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386101d7-0cc9-471d-8fe8-95af3e74a121" containerName="mariadb-database-create" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.766518 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="386101d7-0cc9-471d-8fe8-95af3e74a121" containerName="mariadb-database-create" Jan 05 14:10:30 crc kubenswrapper[4740]: E0105 14:10:30.766535 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef0a3cf-3805-44a3-9c13-601726874901" containerName="mariadb-database-create" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.766541 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef0a3cf-3805-44a3-9c13-601726874901" containerName="mariadb-database-create" Jan 05 14:10:30 crc kubenswrapper[4740]: E0105 14:10:30.766569 4740 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0b8e97e5-739e-4c24-9768-e6a1fb56bace" containerName="mariadb-account-create-update" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.766575 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8e97e5-739e-4c24-9768-e6a1fb56bace" containerName="mariadb-account-create-update" Jan 05 14:10:30 crc kubenswrapper[4740]: E0105 14:10:30.766586 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cadffc7b-e58f-4f7e-b312-d477c2ad9429" containerName="mariadb-database-create" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.766592 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cadffc7b-e58f-4f7e-b312-d477c2ad9429" containerName="mariadb-database-create" Jan 05 14:10:30 crc kubenswrapper[4740]: E0105 14:10:30.766603 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c500176-40a9-42bd-83f6-1dc0df20484b" containerName="heat-cfnapi" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.766608 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c500176-40a9-42bd-83f6-1dc0df20484b" containerName="heat-cfnapi" Jan 05 14:10:30 crc kubenswrapper[4740]: E0105 14:10:30.766618 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6785a14-5ddd-471f-ac7d-eed48bfacd44" containerName="mariadb-account-create-update" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.766642 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6785a14-5ddd-471f-ac7d-eed48bfacd44" containerName="mariadb-account-create-update" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.766847 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6785a14-5ddd-471f-ac7d-eed48bfacd44" containerName="mariadb-account-create-update" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.766862 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="cadffc7b-e58f-4f7e-b312-d477c2ad9429" containerName="mariadb-database-create" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.766873 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2ace04b-27fb-40e6-9cee-6ab7d742cff7" containerName="heat-api" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.766887 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c500176-40a9-42bd-83f6-1dc0df20484b" containerName="heat-cfnapi" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.766897 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="e21239f6-f726-4394-b6fe-f7f7f438d7b5" containerName="mariadb-account-create-update" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.766909 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="386101d7-0cc9-471d-8fe8-95af3e74a121" containerName="mariadb-database-create" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.766920 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ef0a3cf-3805-44a3-9c13-601726874901" containerName="mariadb-database-create" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.766935 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b8e97e5-739e-4c24-9768-e6a1fb56bace" containerName="mariadb-account-create-update" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.767819 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lsnkl" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.782592 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.783246 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.783361 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-drccb" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.784243 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lsnkl"] Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.838464 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7d5f88dcf-flfmp" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.922993 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-599d7559df-v72j9" event={"ID":"49f975a3-3b3b-4dcf-9199-ec3acf657062","Type":"ContainerDied","Data":"e07faf3d616aeb2305305b1c061139bb0523b27cd8252b9db0f867274310582f"} Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.923375 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e07faf3d616aeb2305305b1c061139bb0523b27cd8252b9db0f867274310582f" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.926960 4740 generic.go:334] "Generic (PLEG): container finished" podID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerID="7c4088a7363f13f885a5b43951e875b947affe6879e208b95287cc46117c91cc" exitCode=0 Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.927049 4740 generic.go:334] "Generic (PLEG): container finished" podID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerID="de30b53d9d1039b8468cacce7ca6165ceae6a79462a9bd20a14882e707c48058" exitCode=2 Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.927161 4740 generic.go:334] "Generic (PLEG): container finished" podID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerID="0bc6df6cbffcdcdb01ead8c52f5c895db645a6bed91152a322f7f465094402a6" exitCode=0 Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.927213 4740 generic.go:334] "Generic (PLEG): container finished" podID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerID="88f62be89765f10912f337de8c8107db6100c8e8d0f0dae9766e21cb9fa6410b" exitCode=0 Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.928674 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ea1ba5-42ec-47b9-b385-f68f88ec89a9","Type":"ContainerDied","Data":"7c4088a7363f13f885a5b43951e875b947affe6879e208b95287cc46117c91cc"} Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.928722 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ea1ba5-42ec-47b9-b385-f68f88ec89a9","Type":"ContainerDied","Data":"de30b53d9d1039b8468cacce7ca6165ceae6a79462a9bd20a14882e707c48058"} Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.928739 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ea1ba5-42ec-47b9-b385-f68f88ec89a9","Type":"ContainerDied","Data":"0bc6df6cbffcdcdb01ead8c52f5c895db645a6bed91152a322f7f465094402a6"} Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.928782 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"51ea1ba5-42ec-47b9-b385-f68f88ec89a9","Type":"ContainerDied","Data":"88f62be89765f10912f337de8c8107db6100c8e8d0f0dae9766e21cb9fa6410b"} Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.929453 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.929491 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.946800 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.952916 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.966470 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lsnkl\" (UID: \"231fcf86-a13a-4dce-b991-609c503de1ec\") " pod="openstack/nova-cell0-conductor-db-sync-lsnkl" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.966575 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5whm9\" (UniqueName: \"kubernetes.io/projected/231fcf86-a13a-4dce-b991-609c503de1ec-kube-api-access-5whm9\") pod \"nova-cell0-conductor-db-sync-lsnkl\" (UID: \"231fcf86-a13a-4dce-b991-609c503de1ec\") " pod="openstack/nova-cell0-conductor-db-sync-lsnkl" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.966605 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-config-data\") pod \"nova-cell0-conductor-db-sync-lsnkl\" (UID: \"231fcf86-a13a-4dce-b991-609c503de1ec\") " pod="openstack/nova-cell0-conductor-db-sync-lsnkl" Jan 05 14:10:30 crc kubenswrapper[4740]: I0105 14:10:30.966688 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-scripts\") pod \"nova-cell0-conductor-db-sync-lsnkl\" (UID: \"231fcf86-a13a-4dce-b991-609c503de1ec\") " pod="openstack/nova-cell0-conductor-db-sync-lsnkl" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.078994 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-config-data-custom\") pod \"49f587f8-8537-4ed2-a04f-715a8ea4781c\" (UID: \"49f587f8-8537-4ed2-a04f-715a8ea4781c\") " Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.079042 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-config-data\") pod \"49f975a3-3b3b-4dcf-9199-ec3acf657062\" (UID: \"49f975a3-3b3b-4dcf-9199-ec3acf657062\") " Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.079138 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-combined-ca-bundle\") pod 
\"49f587f8-8537-4ed2-a04f-715a8ea4781c\" (UID: \"49f587f8-8537-4ed2-a04f-715a8ea4781c\") " Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.079177 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtb78\" (UniqueName: \"kubernetes.io/projected/49f587f8-8537-4ed2-a04f-715a8ea4781c-kube-api-access-rtb78\") pod \"49f587f8-8537-4ed2-a04f-715a8ea4781c\" (UID: \"49f587f8-8537-4ed2-a04f-715a8ea4781c\") " Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.079206 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-config-data\") pod \"49f587f8-8537-4ed2-a04f-715a8ea4781c\" (UID: \"49f587f8-8537-4ed2-a04f-715a8ea4781c\") " Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.079220 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-combined-ca-bundle\") pod \"49f975a3-3b3b-4dcf-9199-ec3acf657062\" (UID: \"49f975a3-3b3b-4dcf-9199-ec3acf657062\") " Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.079288 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v768n\" (UniqueName: \"kubernetes.io/projected/49f975a3-3b3b-4dcf-9199-ec3acf657062-kube-api-access-v768n\") pod \"49f975a3-3b3b-4dcf-9199-ec3acf657062\" (UID: \"49f975a3-3b3b-4dcf-9199-ec3acf657062\") " Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.079438 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-config-data-custom\") pod \"49f975a3-3b3b-4dcf-9199-ec3acf657062\" (UID: \"49f975a3-3b3b-4dcf-9199-ec3acf657062\") " Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.079818 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lsnkl\" (UID: \"231fcf86-a13a-4dce-b991-609c503de1ec\") " pod="openstack/nova-cell0-conductor-db-sync-lsnkl" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.079963 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5whm9\" (UniqueName: \"kubernetes.io/projected/231fcf86-a13a-4dce-b991-609c503de1ec-kube-api-access-5whm9\") pod \"nova-cell0-conductor-db-sync-lsnkl\" (UID: \"231fcf86-a13a-4dce-b991-609c503de1ec\") " pod="openstack/nova-cell0-conductor-db-sync-lsnkl" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.079993 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-config-data\") pod \"nova-cell0-conductor-db-sync-lsnkl\" (UID: \"231fcf86-a13a-4dce-b991-609c503de1ec\") " pod="openstack/nova-cell0-conductor-db-sync-lsnkl" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.080132 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-scripts\") pod \"nova-cell0-conductor-db-sync-lsnkl\" (UID: \"231fcf86-a13a-4dce-b991-609c503de1ec\") " pod="openstack/nova-cell0-conductor-db-sync-lsnkl" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.098414 4740 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f587f8-8537-4ed2-a04f-715a8ea4781c-kube-api-access-rtb78" (OuterVolumeSpecName: "kube-api-access-rtb78") pod "49f587f8-8537-4ed2-a04f-715a8ea4781c" (UID: "49f587f8-8537-4ed2-a04f-715a8ea4781c"). InnerVolumeSpecName "kube-api-access-rtb78". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.106176 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "49f975a3-3b3b-4dcf-9199-ec3acf657062" (UID: "49f975a3-3b3b-4dcf-9199-ec3acf657062"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.106680 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-scripts\") pod \"nova-cell0-conductor-db-sync-lsnkl\" (UID: \"231fcf86-a13a-4dce-b991-609c503de1ec\") " pod="openstack/nova-cell0-conductor-db-sync-lsnkl" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.109174 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "49f587f8-8537-4ed2-a04f-715a8ea4781c" (UID: "49f587f8-8537-4ed2-a04f-715a8ea4781c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.117554 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lsnkl\" (UID: \"231fcf86-a13a-4dce-b991-609c503de1ec\") " pod="openstack/nova-cell0-conductor-db-sync-lsnkl" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.121638 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5whm9\" (UniqueName: \"kubernetes.io/projected/231fcf86-a13a-4dce-b991-609c503de1ec-kube-api-access-5whm9\") pod \"nova-cell0-conductor-db-sync-lsnkl\" (UID: \"231fcf86-a13a-4dce-b991-609c503de1ec\") " pod="openstack/nova-cell0-conductor-db-sync-lsnkl" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.124842 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f975a3-3b3b-4dcf-9199-ec3acf657062-kube-api-access-v768n" (OuterVolumeSpecName: "kube-api-access-v768n") pod "49f975a3-3b3b-4dcf-9199-ec3acf657062" (UID: "49f975a3-3b3b-4dcf-9199-ec3acf657062"). InnerVolumeSpecName "kube-api-access-v768n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.125389 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-config-data\") pod \"nova-cell0-conductor-db-sync-lsnkl\" (UID: \"231fcf86-a13a-4dce-b991-609c503de1ec\") " pod="openstack/nova-cell0-conductor-db-sync-lsnkl" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.184103 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v768n\" (UniqueName: \"kubernetes.io/projected/49f975a3-3b3b-4dcf-9199-ec3acf657062-kube-api-access-v768n\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.184148 4740 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.184161 4740 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.184176 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtb78\" (UniqueName: \"kubernetes.io/projected/49f587f8-8537-4ed2-a04f-715a8ea4781c-kube-api-access-rtb78\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.237653 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lsnkl" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.238229 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49f975a3-3b3b-4dcf-9199-ec3acf657062" (UID: "49f975a3-3b3b-4dcf-9199-ec3acf657062"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.256263 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49f587f8-8537-4ed2-a04f-715a8ea4781c" (UID: "49f587f8-8537-4ed2-a04f-715a8ea4781c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.289890 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.289930 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.406610 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-config-data" (OuterVolumeSpecName: "config-data") pod "49f975a3-3b3b-4dcf-9199-ec3acf657062" (UID: "49f975a3-3b3b-4dcf-9199-ec3acf657062"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.424271 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-config-data" (OuterVolumeSpecName: "config-data") pod "49f587f8-8537-4ed2-a04f-715a8ea4781c" (UID: "49f587f8-8537-4ed2-a04f-715a8ea4781c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.444760 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.498369 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-scripts\") pod \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.498475 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9dp4\" (UniqueName: \"kubernetes.io/projected/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-kube-api-access-s9dp4\") pod \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.498645 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-log-httpd\") pod \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.498669 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-config-data\") pod \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.498710 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-run-httpd\") pod \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.498776 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-sg-core-conf-yaml\") pod \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.498863 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-combined-ca-bundle\") pod \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\" (UID: \"51ea1ba5-42ec-47b9-b385-f68f88ec89a9\") " Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.499512 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49f587f8-8537-4ed2-a04f-715a8ea4781c-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.499525 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/49f975a3-3b3b-4dcf-9199-ec3acf657062-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.503706 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "51ea1ba5-42ec-47b9-b385-f68f88ec89a9" (UID: "51ea1ba5-42ec-47b9-b385-f68f88ec89a9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.504544 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-scripts" (OuterVolumeSpecName: "scripts") pod "51ea1ba5-42ec-47b9-b385-f68f88ec89a9" (UID: "51ea1ba5-42ec-47b9-b385-f68f88ec89a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.505827 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "51ea1ba5-42ec-47b9-b385-f68f88ec89a9" (UID: "51ea1ba5-42ec-47b9-b385-f68f88ec89a9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.514801 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-kube-api-access-s9dp4" (OuterVolumeSpecName: "kube-api-access-s9dp4") pod "51ea1ba5-42ec-47b9-b385-f68f88ec89a9" (UID: "51ea1ba5-42ec-47b9-b385-f68f88ec89a9"). InnerVolumeSpecName "kube-api-access-s9dp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.582190 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "51ea1ba5-42ec-47b9-b385-f68f88ec89a9" (UID: "51ea1ba5-42ec-47b9-b385-f68f88ec89a9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.602113 4740 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.602150 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.602164 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9dp4\" (UniqueName: \"kubernetes.io/projected/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-kube-api-access-s9dp4\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.602178 4740 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.602189 4740 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:31 crc kubenswrapper[4740]: E0105 14:10:31.619576 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ae2e0692092a754d462f90895015535a659a53214fa4c504b3775352136f7ee" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 05 14:10:31 crc kubenswrapper[4740]: E0105 14:10:31.621876 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ae2e0692092a754d462f90895015535a659a53214fa4c504b3775352136f7ee" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 05 14:10:31 crc kubenswrapper[4740]: E0105 14:10:31.625208 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ae2e0692092a754d462f90895015535a659a53214fa4c504b3775352136f7ee" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 05 14:10:31 crc kubenswrapper[4740]: E0105 14:10:31.625349 4740 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-7b9bfc84c-zlzbm" podUID="3ac97e60-10b0-43fb-98d3-7feaa0a40fc1" containerName="heat-engine" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.663323 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-config-data" (OuterVolumeSpecName: "config-data") pod "51ea1ba5-42ec-47b9-b385-f68f88ec89a9" (UID: "51ea1ba5-42ec-47b9-b385-f68f88ec89a9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.673266 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51ea1ba5-42ec-47b9-b385-f68f88ec89a9" (UID: "51ea1ba5-42ec-47b9-b385-f68f88ec89a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.704871 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.704906 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ea1ba5-42ec-47b9-b385-f68f88ec89a9-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.916197 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.916261 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.916312 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.917233 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7164e8ec74a1f47d6179acf6f6c20f4c18a05f4cce20c982561062888c48c311"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.917298 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://7164e8ec74a1f47d6179acf6f6c20f4c18a05f4cce20c982561062888c48c311" gracePeriod=600 Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.944617 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" event={"ID":"49f587f8-8537-4ed2-a04f-715a8ea4781c","Type":"ContainerDied","Data":"433487d84e3f089c15db82dcfd26c3e2e3925e585e4f94e5013bf6d83d76500b"} Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.944665 4740 scope.go:117] "RemoveContainer" containerID="5a4fa887144ac1172bd70e0f001c9c5644e25b476e5e599aba817e56fdeb98f3" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.944756 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5879f5d6d6-rz6db" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.964965 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lsnkl"] Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.966614 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-599d7559df-v72j9" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.966678 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:10:31 crc kubenswrapper[4740]: I0105 14:10:31.966905 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51ea1ba5-42ec-47b9-b385-f68f88ec89a9","Type":"ContainerDied","Data":"80df1cf6fafbbe4dc0ec51bd013b4df38a7b7011f9679813bd212171f4c949b4"} Jan 05 14:10:31 crc kubenswrapper[4740]: W0105 14:10:31.968846 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod231fcf86_a13a_4dce_b991_609c503de1ec.slice/crio-6ecc2aab20fe7a76f04a2372b863aa1da5f041e5687ba53faec4bf1b7e32ef79 WatchSource:0}: Error finding container 6ecc2aab20fe7a76f04a2372b863aa1da5f041e5687ba53faec4bf1b7e32ef79: Status 404 returned error can't find the container with id 6ecc2aab20fe7a76f04a2372b863aa1da5f041e5687ba53faec4bf1b7e32ef79 Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.007214 4740 scope.go:117] "RemoveContainer" containerID="7c4088a7363f13f885a5b43951e875b947affe6879e208b95287cc46117c91cc" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.167737 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5879f5d6d6-rz6db"] Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.173525 4740 scope.go:117] "RemoveContainer" containerID="de30b53d9d1039b8468cacce7ca6165ceae6a79462a9bd20a14882e707c48058" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.179834 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5879f5d6d6-rz6db"] Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.208645 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.215238 4740 scope.go:117] "RemoveContainer" containerID="0bc6df6cbffcdcdb01ead8c52f5c895db645a6bed91152a322f7f465094402a6" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.268817 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.279221 4740 scope.go:117] "RemoveContainer" containerID="88f62be89765f10912f337de8c8107db6100c8e8d0f0dae9766e21cb9fa6410b" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.294719 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-599d7559df-v72j9"] Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.309173 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-599d7559df-v72j9"] Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.328110 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:32 crc kubenswrapper[4740]: E0105 14:10:32.328662 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f587f8-8537-4ed2-a04f-715a8ea4781c" containerName="heat-cfnapi" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.328680 4740 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="49f587f8-8537-4ed2-a04f-715a8ea4781c" containerName="heat-cfnapi" Jan 05 14:10:32 crc kubenswrapper[4740]: E0105 14:10:32.328699 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f587f8-8537-4ed2-a04f-715a8ea4781c" containerName="heat-cfnapi" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.328707 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f587f8-8537-4ed2-a04f-715a8ea4781c" containerName="heat-cfnapi" Jan 05 14:10:32 crc kubenswrapper[4740]: E0105 14:10:32.328722 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f975a3-3b3b-4dcf-9199-ec3acf657062" containerName="heat-api" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.328728 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f975a3-3b3b-4dcf-9199-ec3acf657062" containerName="heat-api" Jan 05 14:10:32 crc kubenswrapper[4740]: E0105 14:10:32.328752 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerName="ceilometer-central-agent" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.328757 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerName="ceilometer-central-agent" Jan 05 14:10:32 crc kubenswrapper[4740]: E0105 14:10:32.328773 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerName="proxy-httpd" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.328779 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerName="proxy-httpd" Jan 05 14:10:32 crc kubenswrapper[4740]: E0105 14:10:32.328806 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f975a3-3b3b-4dcf-9199-ec3acf657062" containerName="heat-api" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.328812 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f975a3-3b3b-4dcf-9199-ec3acf657062" containerName="heat-api" Jan 05 14:10:32 crc kubenswrapper[4740]: E0105 14:10:32.328831 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerName="ceilometer-notification-agent" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.328837 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerName="ceilometer-notification-agent" Jan 05 14:10:32 crc kubenswrapper[4740]: E0105 14:10:32.328851 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerName="sg-core" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.328857 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerName="sg-core" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.329101 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f587f8-8537-4ed2-a04f-715a8ea4781c" containerName="heat-cfnapi" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.329112 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f587f8-8537-4ed2-a04f-715a8ea4781c" containerName="heat-cfnapi" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.329122 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f975a3-3b3b-4dcf-9199-ec3acf657062" containerName="heat-api" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.329130 4740 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerName="ceilometer-central-agent" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.329138 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerName="ceilometer-notification-agent" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.329151 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerName="sg-core" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.329166 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" containerName="proxy-httpd" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.329177 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f975a3-3b3b-4dcf-9199-ec3acf657062" containerName="heat-api" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.331610 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.334836 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.334997 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.340206 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.437246 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392438d7-4751-48d7-b3ab-3e30200f5aa9-run-httpd\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.437396 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392438d7-4751-48d7-b3ab-3e30200f5aa9-log-httpd\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.437427 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.437501 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6shgp\" (UniqueName: \"kubernetes.io/projected/392438d7-4751-48d7-b3ab-3e30200f5aa9-kube-api-access-6shgp\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.437552 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-config-data\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.437751 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.437798 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-scripts\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.539590 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.539985 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-scripts\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.540011 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392438d7-4751-48d7-b3ab-3e30200f5aa9-run-httpd\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.540052 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392438d7-4751-48d7-b3ab-3e30200f5aa9-log-httpd\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.540089 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.540128 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6shgp\" (UniqueName: \"kubernetes.io/projected/392438d7-4751-48d7-b3ab-3e30200f5aa9-kube-api-access-6shgp\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.540161 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-config-data\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.541770 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392438d7-4751-48d7-b3ab-3e30200f5aa9-log-httpd\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.542338 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392438d7-4751-48d7-b3ab-3e30200f5aa9-run-httpd\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.548536 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.555489 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.556092 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-scripts\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.559863 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-config-data\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.565127 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6shgp\" (UniqueName: \"kubernetes.io/projected/392438d7-4751-48d7-b3ab-3e30200f5aa9-kube-api-access-6shgp\") pod \"ceilometer-0\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.662748 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.783968 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.805744 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.998376 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f587f8-8537-4ed2-a04f-715a8ea4781c" path="/var/lib/kubelet/pods/49f587f8-8537-4ed2-a04f-715a8ea4781c/volumes" Jan 05 14:10:32 crc kubenswrapper[4740]: I0105 14:10:32.999379 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f975a3-3b3b-4dcf-9199-ec3acf657062" path="/var/lib/kubelet/pods/49f975a3-3b3b-4dcf-9199-ec3acf657062/volumes" Jan 05 14:10:33 crc kubenswrapper[4740]: I0105 14:10:33.000097 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51ea1ba5-42ec-47b9-b385-f68f88ec89a9" path="/var/lib/kubelet/pods/51ea1ba5-42ec-47b9-b385-f68f88ec89a9/volumes" Jan 05 14:10:33 crc kubenswrapper[4740]: I0105 14:10:33.060036 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="7164e8ec74a1f47d6179acf6f6c20f4c18a05f4cce20c982561062888c48c311" exitCode=0 Jan 05 14:10:33 crc kubenswrapper[4740]: I0105 14:10:33.060134 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"7164e8ec74a1f47d6179acf6f6c20f4c18a05f4cce20c982561062888c48c311"} Jan 05 14:10:33 crc kubenswrapper[4740]: I0105 14:10:33.060423 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"bcada73fec747c8bb22d39df02fd140c2ecd4b0b0dc04e0085ad7f13dab4ab07"} Jan 05 14:10:33 crc kubenswrapper[4740]: I0105 14:10:33.060445 4740 scope.go:117] "RemoveContainer" containerID="d1fef0aff31cd613b2d207d472e346207cc1735aecec60c1ba22825fb1d47f9d" Jan 05 14:10:33 crc kubenswrapper[4740]: I0105 14:10:33.069210 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lsnkl" event={"ID":"231fcf86-a13a-4dce-b991-609c503de1ec","Type":"ContainerStarted","Data":"6ecc2aab20fe7a76f04a2372b863aa1da5f041e5687ba53faec4bf1b7e32ef79"} Jan 05 14:10:33 crc kubenswrapper[4740]: I0105 14:10:33.511468 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:34 crc kubenswrapper[4740]: I0105 14:10:34.162400 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392438d7-4751-48d7-b3ab-3e30200f5aa9","Type":"ContainerStarted","Data":"bff9d1cd5e71d76b4eca03295c923cadece1f9bccf3ad0e430fb9fac8361814d"} Jan 05 14:10:34 crc kubenswrapper[4740]: I0105 14:10:34.915657 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 05 14:10:34 crc kubenswrapper[4740]: I0105 14:10:34.916077 4740 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 05 14:10:34 crc kubenswrapper[4740]: I0105 14:10:34.918956 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" 
Jan 05 14:10:35 crc kubenswrapper[4740]: I0105 14:10:35.175020 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392438d7-4751-48d7-b3ab-3e30200f5aa9","Type":"ContainerStarted","Data":"af0ee91187a4418220013589832f46a5f3d3af24ab7491b29b94d92f90e7c8e4"} Jan 05 14:10:36 crc kubenswrapper[4740]: I0105 14:10:36.190339 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392438d7-4751-48d7-b3ab-3e30200f5aa9","Type":"ContainerStarted","Data":"c1ab6591c71ac148cb19cd0f5f699c07cfb4c5098873ef151a1d91390f9f7fe0"} Jan 05 14:10:36 crc kubenswrapper[4740]: I0105 14:10:36.517692 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 05 14:10:36 crc kubenswrapper[4740]: I0105 14:10:36.937981 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:37 crc kubenswrapper[4740]: I0105 14:10:37.212809 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392438d7-4751-48d7-b3ab-3e30200f5aa9","Type":"ContainerStarted","Data":"3e27cb0f95cb0802f98e73bdd7ab35929bd257778ad7bfbde4b098f458f9cbee"} Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.164506 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.225344 4740 generic.go:334] "Generic (PLEG): container finished" podID="3ac97e60-10b0-43fb-98d3-7feaa0a40fc1" containerID="3ae2e0692092a754d462f90895015535a659a53214fa4c504b3775352136f7ee" exitCode=0 Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.225387 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7b9bfc84c-zlzbm" event={"ID":"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1","Type":"ContainerDied","Data":"3ae2e0692092a754d462f90895015535a659a53214fa4c504b3775352136f7ee"} Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.225415 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7b9bfc84c-zlzbm" event={"ID":"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1","Type":"ContainerDied","Data":"26bacf4364cc15a34cb5970d252f51c3fecaaa0d28a361a83fff4142b512a12b"} Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.225432 4740 scope.go:117] "RemoveContainer" containerID="3ae2e0692092a754d462f90895015535a659a53214fa4c504b3775352136f7ee" Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.225566 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7b9bfc84c-zlzbm" Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.243871 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-combined-ca-bundle\") pod \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\" (UID: \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\") " Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.244039 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-config-data\") pod \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\" (UID: \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\") " Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.244079 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsx2h\" (UniqueName: \"kubernetes.io/projected/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-kube-api-access-bsx2h\") pod \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\" (UID: \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\") " Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.244224 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-config-data-custom\") pod \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\" (UID: \"3ac97e60-10b0-43fb-98d3-7feaa0a40fc1\") " Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.252835 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-kube-api-access-bsx2h" (OuterVolumeSpecName: "kube-api-access-bsx2h") pod "3ac97e60-10b0-43fb-98d3-7feaa0a40fc1" (UID: "3ac97e60-10b0-43fb-98d3-7feaa0a40fc1"). InnerVolumeSpecName "kube-api-access-bsx2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.264413 4740 scope.go:117] "RemoveContainer" containerID="3ae2e0692092a754d462f90895015535a659a53214fa4c504b3775352136f7ee" Jan 05 14:10:38 crc kubenswrapper[4740]: E0105 14:10:38.264862 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ae2e0692092a754d462f90895015535a659a53214fa4c504b3775352136f7ee\": container with ID starting with 3ae2e0692092a754d462f90895015535a659a53214fa4c504b3775352136f7ee not found: ID does not exist" containerID="3ae2e0692092a754d462f90895015535a659a53214fa4c504b3775352136f7ee" Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.264904 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae2e0692092a754d462f90895015535a659a53214fa4c504b3775352136f7ee"} err="failed to get container status \"3ae2e0692092a754d462f90895015535a659a53214fa4c504b3775352136f7ee\": rpc error: code = NotFound desc = could not find container \"3ae2e0692092a754d462f90895015535a659a53214fa4c504b3775352136f7ee\": container with ID starting with 3ae2e0692092a754d462f90895015535a659a53214fa4c504b3775352136f7ee not found: ID does not exist" Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.267297 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3ac97e60-10b0-43fb-98d3-7feaa0a40fc1" (UID: "3ac97e60-10b0-43fb-98d3-7feaa0a40fc1"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.347036 4740 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.347084 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsx2h\" (UniqueName: \"kubernetes.io/projected/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-kube-api-access-bsx2h\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.373481 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ac97e60-10b0-43fb-98d3-7feaa0a40fc1" (UID: "3ac97e60-10b0-43fb-98d3-7feaa0a40fc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.381197 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-config-data" (OuterVolumeSpecName: "config-data") pod "3ac97e60-10b0-43fb-98d3-7feaa0a40fc1" (UID: "3ac97e60-10b0-43fb-98d3-7feaa0a40fc1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.448921 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.449234 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.625716 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7b9bfc84c-zlzbm"] Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.638541 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-7b9bfc84c-zlzbm"] Jan 05 14:10:38 crc kubenswrapper[4740]: I0105 14:10:38.981474 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ac97e60-10b0-43fb-98d3-7feaa0a40fc1" path="/var/lib/kubelet/pods/3ac97e60-10b0-43fb-98d3-7feaa0a40fc1/volumes" Jan 05 14:10:39 crc kubenswrapper[4740]: I0105 14:10:39.245187 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392438d7-4751-48d7-b3ab-3e30200f5aa9","Type":"ContainerStarted","Data":"a04ee01b8c13618e2456348d3d87da718a13ce80528d5473c071fd5acf3e4ebb"} Jan 05 14:10:39 crc kubenswrapper[4740]: I0105 14:10:39.245709 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 05 14:10:39 crc kubenswrapper[4740]: I0105 14:10:39.245720 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerName="ceilometer-central-agent" containerID="cri-o://af0ee91187a4418220013589832f46a5f3d3af24ab7491b29b94d92f90e7c8e4" gracePeriod=30 Jan 05 14:10:39 crc kubenswrapper[4740]: I0105 14:10:39.245784 4740 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ceilometer-0" podUID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerName="sg-core" containerID="cri-o://3e27cb0f95cb0802f98e73bdd7ab35929bd257778ad7bfbde4b098f458f9cbee" gracePeriod=30 Jan 05 14:10:39 crc kubenswrapper[4740]: I0105 14:10:39.245855 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerName="ceilometer-notification-agent" containerID="cri-o://c1ab6591c71ac148cb19cd0f5f699c07cfb4c5098873ef151a1d91390f9f7fe0" gracePeriod=30 Jan 05 14:10:39 crc kubenswrapper[4740]: I0105 14:10:39.245883 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerName="proxy-httpd" containerID="cri-o://a04ee01b8c13618e2456348d3d87da718a13ce80528d5473c071fd5acf3e4ebb" gracePeriod=30 Jan 05 14:10:39 crc kubenswrapper[4740]: I0105 14:10:39.278069 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.913339378 podStartE2EDuration="7.278035341s" podCreationTimestamp="2026-01-05 14:10:32 +0000 UTC" firstStartedPulling="2026-01-05 14:10:33.510536955 +0000 UTC m=+1282.817445534" lastFinishedPulling="2026-01-05 14:10:37.875232918 +0000 UTC m=+1287.182141497" observedRunningTime="2026-01-05 14:10:39.268969098 +0000 UTC m=+1288.575877687" watchObservedRunningTime="2026-01-05 14:10:39.278035341 +0000 UTC m=+1288.584943910" Jan 05 14:10:39 crc kubenswrapper[4740]: E0105 14:10:39.934850 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod392438d7_4751_48d7_b3ab_3e30200f5aa9.slice/crio-c1ab6591c71ac148cb19cd0f5f699c07cfb4c5098873ef151a1d91390f9f7fe0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod392438d7_4751_48d7_b3ab_3e30200f5aa9.slice/crio-conmon-c1ab6591c71ac148cb19cd0f5f699c07cfb4c5098873ef151a1d91390f9f7fe0.scope\": RecentStats: unable to find data in memory cache]" Jan 05 14:10:40 crc kubenswrapper[4740]: I0105 14:10:40.266406 4740 generic.go:334] "Generic (PLEG): container finished" podID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerID="a04ee01b8c13618e2456348d3d87da718a13ce80528d5473c071fd5acf3e4ebb" exitCode=0 Jan 05 14:10:40 crc kubenswrapper[4740]: I0105 14:10:40.266661 4740 generic.go:334] "Generic (PLEG): container finished" podID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerID="3e27cb0f95cb0802f98e73bdd7ab35929bd257778ad7bfbde4b098f458f9cbee" exitCode=2 Jan 05 14:10:40 crc kubenswrapper[4740]: I0105 14:10:40.266478 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392438d7-4751-48d7-b3ab-3e30200f5aa9","Type":"ContainerDied","Data":"a04ee01b8c13618e2456348d3d87da718a13ce80528d5473c071fd5acf3e4ebb"} Jan 05 14:10:40 crc kubenswrapper[4740]: I0105 14:10:40.266712 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392438d7-4751-48d7-b3ab-3e30200f5aa9","Type":"ContainerDied","Data":"3e27cb0f95cb0802f98e73bdd7ab35929bd257778ad7bfbde4b098f458f9cbee"} Jan 05 14:10:40 crc kubenswrapper[4740]: I0105 14:10:40.266726 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"392438d7-4751-48d7-b3ab-3e30200f5aa9","Type":"ContainerDied","Data":"c1ab6591c71ac148cb19cd0f5f699c07cfb4c5098873ef151a1d91390f9f7fe0"} Jan 05 14:10:40 crc kubenswrapper[4740]: I0105 14:10:40.266674 4740 generic.go:334] "Generic (PLEG): container finished" podID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerID="c1ab6591c71ac148cb19cd0f5f699c07cfb4c5098873ef151a1d91390f9f7fe0" exitCode=0 Jan 05 14:10:46 crc kubenswrapper[4740]: I0105 14:10:46.340033 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lsnkl" event={"ID":"231fcf86-a13a-4dce-b991-609c503de1ec","Type":"ContainerStarted","Data":"55e7651216719090d790e99574bd370d4cc43c269d7cae4fb327bf843c608baf"} Jan 05 14:10:46 crc kubenswrapper[4740]: I0105 14:10:46.361738 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-lsnkl" podStartSLOduration=2.5790309259999997 podStartE2EDuration="16.361716696s" podCreationTimestamp="2026-01-05 14:10:30 +0000 UTC" firstStartedPulling="2026-01-05 14:10:31.97653466 +0000 UTC m=+1281.283443239" lastFinishedPulling="2026-01-05 14:10:45.75922042 +0000 UTC m=+1295.066129009" observedRunningTime="2026-01-05 14:10:46.360613815 +0000 UTC m=+1295.667522394" watchObservedRunningTime="2026-01-05 14:10:46.361716696 +0000 UTC m=+1295.668625275" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.021769 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.164214 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-scripts\") pod \"392438d7-4751-48d7-b3ab-3e30200f5aa9\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.164328 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392438d7-4751-48d7-b3ab-3e30200f5aa9-run-httpd\") pod \"392438d7-4751-48d7-b3ab-3e30200f5aa9\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.164367 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-config-data\") pod \"392438d7-4751-48d7-b3ab-3e30200f5aa9\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.164523 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6shgp\" (UniqueName: \"kubernetes.io/projected/392438d7-4751-48d7-b3ab-3e30200f5aa9-kube-api-access-6shgp\") pod \"392438d7-4751-48d7-b3ab-3e30200f5aa9\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.164561 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-combined-ca-bundle\") pod \"392438d7-4751-48d7-b3ab-3e30200f5aa9\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.164684 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-sg-core-conf-yaml\") 
pod \"392438d7-4751-48d7-b3ab-3e30200f5aa9\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.164720 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392438d7-4751-48d7-b3ab-3e30200f5aa9-log-httpd\") pod \"392438d7-4751-48d7-b3ab-3e30200f5aa9\" (UID: \"392438d7-4751-48d7-b3ab-3e30200f5aa9\") " Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.165559 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/392438d7-4751-48d7-b3ab-3e30200f5aa9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "392438d7-4751-48d7-b3ab-3e30200f5aa9" (UID: "392438d7-4751-48d7-b3ab-3e30200f5aa9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.166087 4740 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392438d7-4751-48d7-b3ab-3e30200f5aa9-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.166175 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/392438d7-4751-48d7-b3ab-3e30200f5aa9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "392438d7-4751-48d7-b3ab-3e30200f5aa9" (UID: "392438d7-4751-48d7-b3ab-3e30200f5aa9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.170563 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392438d7-4751-48d7-b3ab-3e30200f5aa9-kube-api-access-6shgp" (OuterVolumeSpecName: "kube-api-access-6shgp") pod "392438d7-4751-48d7-b3ab-3e30200f5aa9" (UID: "392438d7-4751-48d7-b3ab-3e30200f5aa9"). InnerVolumeSpecName "kube-api-access-6shgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.177285 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-scripts" (OuterVolumeSpecName: "scripts") pod "392438d7-4751-48d7-b3ab-3e30200f5aa9" (UID: "392438d7-4751-48d7-b3ab-3e30200f5aa9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.211357 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "392438d7-4751-48d7-b3ab-3e30200f5aa9" (UID: "392438d7-4751-48d7-b3ab-3e30200f5aa9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.268315 4740 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.268348 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.268357 4740 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392438d7-4751-48d7-b3ab-3e30200f5aa9-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.268368 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6shgp\" (UniqueName: \"kubernetes.io/projected/392438d7-4751-48d7-b3ab-3e30200f5aa9-kube-api-access-6shgp\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.278564 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "392438d7-4751-48d7-b3ab-3e30200f5aa9" (UID: "392438d7-4751-48d7-b3ab-3e30200f5aa9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.303193 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-config-data" (OuterVolumeSpecName: "config-data") pod "392438d7-4751-48d7-b3ab-3e30200f5aa9" (UID: "392438d7-4751-48d7-b3ab-3e30200f5aa9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.353218 4740 generic.go:334] "Generic (PLEG): container finished" podID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerID="af0ee91187a4418220013589832f46a5f3d3af24ab7491b29b94d92f90e7c8e4" exitCode=0 Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.353385 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392438d7-4751-48d7-b3ab-3e30200f5aa9","Type":"ContainerDied","Data":"af0ee91187a4418220013589832f46a5f3d3af24ab7491b29b94d92f90e7c8e4"} Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.353627 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392438d7-4751-48d7-b3ab-3e30200f5aa9","Type":"ContainerDied","Data":"bff9d1cd5e71d76b4eca03295c923cadece1f9bccf3ad0e430fb9fac8361814d"} Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.353657 4740 scope.go:117] "RemoveContainer" containerID="a04ee01b8c13618e2456348d3d87da718a13ce80528d5473c071fd5acf3e4ebb" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.353496 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.370637 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.370668 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392438d7-4751-48d7-b3ab-3e30200f5aa9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.394574 4740 scope.go:117] "RemoveContainer" containerID="3e27cb0f95cb0802f98e73bdd7ab35929bd257778ad7bfbde4b098f458f9cbee" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.402808 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.447350 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.469012 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:47 crc kubenswrapper[4740]: E0105 14:10:47.469820 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac97e60-10b0-43fb-98d3-7feaa0a40fc1" containerName="heat-engine" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.469842 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac97e60-10b0-43fb-98d3-7feaa0a40fc1" containerName="heat-engine" Jan 05 14:10:47 crc kubenswrapper[4740]: E0105 14:10:47.469858 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerName="sg-core" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.470102 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerName="sg-core" Jan 05 14:10:47 crc kubenswrapper[4740]: E0105 14:10:47.470207 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerName="proxy-httpd" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.470246 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerName="proxy-httpd" Jan 05 14:10:47 crc kubenswrapper[4740]: E0105 14:10:47.470345 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerName="ceilometer-notification-agent" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.470357 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerName="ceilometer-notification-agent" Jan 05 14:10:47 crc kubenswrapper[4740]: E0105 14:10:47.470511 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerName="ceilometer-central-agent" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.470522 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerName="ceilometer-central-agent" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.471279 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerName="sg-core" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.471339 4740 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3ac97e60-10b0-43fb-98d3-7feaa0a40fc1" containerName="heat-engine" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.471368 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerName="ceilometer-central-agent" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.471385 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerName="ceilometer-notification-agent" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.471397 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="392438d7-4751-48d7-b3ab-3e30200f5aa9" containerName="proxy-httpd" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.474859 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.476291 4740 scope.go:117] "RemoveContainer" containerID="c1ab6591c71ac148cb19cd0f5f699c07cfb4c5098873ef151a1d91390f9f7fe0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.482969 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.483116 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.491107 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.516803 4740 scope.go:117] "RemoveContainer" containerID="af0ee91187a4418220013589832f46a5f3d3af24ab7491b29b94d92f90e7c8e4" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.540407 4740 scope.go:117] "RemoveContainer" containerID="a04ee01b8c13618e2456348d3d87da718a13ce80528d5473c071fd5acf3e4ebb" Jan 05 14:10:47 crc kubenswrapper[4740]: E0105 14:10:47.540827 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a04ee01b8c13618e2456348d3d87da718a13ce80528d5473c071fd5acf3e4ebb\": container with ID starting with a04ee01b8c13618e2456348d3d87da718a13ce80528d5473c071fd5acf3e4ebb not found: ID does not exist" containerID="a04ee01b8c13618e2456348d3d87da718a13ce80528d5473c071fd5acf3e4ebb" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.540857 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a04ee01b8c13618e2456348d3d87da718a13ce80528d5473c071fd5acf3e4ebb"} err="failed to get container status \"a04ee01b8c13618e2456348d3d87da718a13ce80528d5473c071fd5acf3e4ebb\": rpc error: code = NotFound desc = could not find container \"a04ee01b8c13618e2456348d3d87da718a13ce80528d5473c071fd5acf3e4ebb\": container with ID starting with a04ee01b8c13618e2456348d3d87da718a13ce80528d5473c071fd5acf3e4ebb not found: ID does not exist" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.540879 4740 scope.go:117] "RemoveContainer" containerID="3e27cb0f95cb0802f98e73bdd7ab35929bd257778ad7bfbde4b098f458f9cbee" Jan 05 14:10:47 crc kubenswrapper[4740]: E0105 14:10:47.541326 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e27cb0f95cb0802f98e73bdd7ab35929bd257778ad7bfbde4b098f458f9cbee\": container with ID starting with 3e27cb0f95cb0802f98e73bdd7ab35929bd257778ad7bfbde4b098f458f9cbee not found: ID does not exist" 
containerID="3e27cb0f95cb0802f98e73bdd7ab35929bd257778ad7bfbde4b098f458f9cbee" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.541346 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e27cb0f95cb0802f98e73bdd7ab35929bd257778ad7bfbde4b098f458f9cbee"} err="failed to get container status \"3e27cb0f95cb0802f98e73bdd7ab35929bd257778ad7bfbde4b098f458f9cbee\": rpc error: code = NotFound desc = could not find container \"3e27cb0f95cb0802f98e73bdd7ab35929bd257778ad7bfbde4b098f458f9cbee\": container with ID starting with 3e27cb0f95cb0802f98e73bdd7ab35929bd257778ad7bfbde4b098f458f9cbee not found: ID does not exist" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.541358 4740 scope.go:117] "RemoveContainer" containerID="c1ab6591c71ac148cb19cd0f5f699c07cfb4c5098873ef151a1d91390f9f7fe0" Jan 05 14:10:47 crc kubenswrapper[4740]: E0105 14:10:47.541616 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1ab6591c71ac148cb19cd0f5f699c07cfb4c5098873ef151a1d91390f9f7fe0\": container with ID starting with c1ab6591c71ac148cb19cd0f5f699c07cfb4c5098873ef151a1d91390f9f7fe0 not found: ID does not exist" containerID="c1ab6591c71ac148cb19cd0f5f699c07cfb4c5098873ef151a1d91390f9f7fe0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.541634 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ab6591c71ac148cb19cd0f5f699c07cfb4c5098873ef151a1d91390f9f7fe0"} err="failed to get container status \"c1ab6591c71ac148cb19cd0f5f699c07cfb4c5098873ef151a1d91390f9f7fe0\": rpc error: code = NotFound desc = could not find container \"c1ab6591c71ac148cb19cd0f5f699c07cfb4c5098873ef151a1d91390f9f7fe0\": container with ID starting with c1ab6591c71ac148cb19cd0f5f699c07cfb4c5098873ef151a1d91390f9f7fe0 not found: ID does not exist" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.541645 4740 scope.go:117] "RemoveContainer" containerID="af0ee91187a4418220013589832f46a5f3d3af24ab7491b29b94d92f90e7c8e4" Jan 05 14:10:47 crc kubenswrapper[4740]: E0105 14:10:47.541964 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af0ee91187a4418220013589832f46a5f3d3af24ab7491b29b94d92f90e7c8e4\": container with ID starting with af0ee91187a4418220013589832f46a5f3d3af24ab7491b29b94d92f90e7c8e4 not found: ID does not exist" containerID="af0ee91187a4418220013589832f46a5f3d3af24ab7491b29b94d92f90e7c8e4" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.541984 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0ee91187a4418220013589832f46a5f3d3af24ab7491b29b94d92f90e7c8e4"} err="failed to get container status \"af0ee91187a4418220013589832f46a5f3d3af24ab7491b29b94d92f90e7c8e4\": rpc error: code = NotFound desc = could not find container \"af0ee91187a4418220013589832f46a5f3d3af24ab7491b29b94d92f90e7c8e4\": container with ID starting with af0ee91187a4418220013589832f46a5f3d3af24ab7491b29b94d92f90e7c8e4 not found: ID does not exist" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.577616 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-config-data\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: 
I0105 14:10:47.577688 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-scripts\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.577717 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4jxg\" (UniqueName: \"kubernetes.io/projected/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-kube-api-access-r4jxg\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.577785 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.577849 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-log-httpd\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.577874 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.577907 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-run-httpd\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.680204 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.680291 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-run-httpd\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.680425 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-config-data\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.680488 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-scripts\") pod \"ceilometer-0\" (UID: 
\"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.680519 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4jxg\" (UniqueName: \"kubernetes.io/projected/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-kube-api-access-r4jxg\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.680555 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.680649 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-log-httpd\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.681232 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-log-httpd\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.682505 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-run-httpd\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.692042 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.693043 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-scripts\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.694740 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.698804 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4jxg\" (UniqueName: \"kubernetes.io/projected/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-kube-api-access-r4jxg\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.703101 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-config-data\") pod \"ceilometer-0\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " 
pod="openstack/ceilometer-0" Jan 05 14:10:47 crc kubenswrapper[4740]: I0105 14:10:47.804778 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:10:48 crc kubenswrapper[4740]: I0105 14:10:48.339377 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:48 crc kubenswrapper[4740]: I0105 14:10:48.366220 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d1ae341-9055-4ead-9bbf-57e6489ef7b4","Type":"ContainerStarted","Data":"91449ab436188a302b38caa7ba571001a604c0e2a6578b10f1589e0ef93f5ff6"} Jan 05 14:10:48 crc kubenswrapper[4740]: I0105 14:10:48.982997 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392438d7-4751-48d7-b3ab-3e30200f5aa9" path="/var/lib/kubelet/pods/392438d7-4751-48d7-b3ab-3e30200f5aa9/volumes" Jan 05 14:10:49 crc kubenswrapper[4740]: I0105 14:10:49.378990 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d1ae341-9055-4ead-9bbf-57e6489ef7b4","Type":"ContainerStarted","Data":"88b79803b998655fef119a7849d03534229196d076d8197af43083c7797d81ea"} Jan 05 14:10:50 crc kubenswrapper[4740]: I0105 14:10:50.392591 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d1ae341-9055-4ead-9bbf-57e6489ef7b4","Type":"ContainerStarted","Data":"ce08003df15a2bbda00933e5b203c5706fd9ebf53f0eda8baf87866d74b94dc3"} Jan 05 14:10:51 crc kubenswrapper[4740]: I0105 14:10:51.406580 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d1ae341-9055-4ead-9bbf-57e6489ef7b4","Type":"ContainerStarted","Data":"6564dd5fb8b64558ba00ffd342d387b568b5218dc4d9b75b6c50bc0c36baac73"} Jan 05 14:10:52 crc kubenswrapper[4740]: I0105 14:10:52.419779 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d1ae341-9055-4ead-9bbf-57e6489ef7b4","Type":"ContainerStarted","Data":"ee40b3d9b9db62fd19a5cde86df79de30a32bbb0d2d4e6da391376d0f54e3221"} Jan 05 14:10:52 crc kubenswrapper[4740]: I0105 14:10:52.420345 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 05 14:10:52 crc kubenswrapper[4740]: I0105 14:10:52.451908 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.1777099 podStartE2EDuration="5.451887815s" podCreationTimestamp="2026-01-05 14:10:47 +0000 UTC" firstStartedPulling="2026-01-05 14:10:48.333789742 +0000 UTC m=+1297.640698321" lastFinishedPulling="2026-01-05 14:10:51.607967657 +0000 UTC m=+1300.914876236" observedRunningTime="2026-01-05 14:10:52.437177509 +0000 UTC m=+1301.744086108" watchObservedRunningTime="2026-01-05 14:10:52.451887815 +0000 UTC m=+1301.758796394" Jan 05 14:10:53 crc kubenswrapper[4740]: I0105 14:10:53.743889 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:54 crc kubenswrapper[4740]: I0105 14:10:54.443218 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerName="ceilometer-central-agent" containerID="cri-o://88b79803b998655fef119a7849d03534229196d076d8197af43083c7797d81ea" gracePeriod=30 Jan 05 14:10:54 crc kubenswrapper[4740]: I0105 14:10:54.443467 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerName="sg-core" containerID="cri-o://6564dd5fb8b64558ba00ffd342d387b568b5218dc4d9b75b6c50bc0c36baac73" gracePeriod=30 Jan 05 14:10:54 crc kubenswrapper[4740]: I0105 14:10:54.443512 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerName="ceilometer-notification-agent" containerID="cri-o://ce08003df15a2bbda00933e5b203c5706fd9ebf53f0eda8baf87866d74b94dc3" gracePeriod=30 Jan 05 14:10:54 crc kubenswrapper[4740]: I0105 14:10:54.443454 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerName="proxy-httpd" containerID="cri-o://ee40b3d9b9db62fd19a5cde86df79de30a32bbb0d2d4e6da391376d0f54e3221" gracePeriod=30 Jan 05 14:10:55 crc kubenswrapper[4740]: I0105 14:10:55.424174 4740 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pode21239f6-f726-4394-b6fe-f7f7f438d7b5"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pode21239f6-f726-4394-b6fe-f7f7f438d7b5] : Timed out while waiting for systemd to remove kubepods-besteffort-pode21239f6_f726_4394_b6fe_f7f7f438d7b5.slice" Jan 05 14:10:55 crc kubenswrapper[4740]: E0105 14:10:55.424374 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pode21239f6-f726-4394-b6fe-f7f7f438d7b5] : unable to destroy cgroup paths for cgroup [kubepods besteffort pode21239f6-f726-4394-b6fe-f7f7f438d7b5] : Timed out while waiting for systemd to remove kubepods-besteffort-pode21239f6_f726_4394_b6fe_f7f7f438d7b5.slice" pod="openstack/nova-cell1-6419-account-create-update-fgr2l" podUID="e21239f6-f726-4394-b6fe-f7f7f438d7b5" Jan 05 14:10:55 crc kubenswrapper[4740]: I0105 14:10:55.455644 4740 generic.go:334] "Generic (PLEG): container finished" podID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerID="ee40b3d9b9db62fd19a5cde86df79de30a32bbb0d2d4e6da391376d0f54e3221" exitCode=0 Jan 05 14:10:55 crc kubenswrapper[4740]: I0105 14:10:55.455691 4740 generic.go:334] "Generic (PLEG): container finished" podID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerID="6564dd5fb8b64558ba00ffd342d387b568b5218dc4d9b75b6c50bc0c36baac73" exitCode=2 Jan 05 14:10:55 crc kubenswrapper[4740]: I0105 14:10:55.455701 4740 generic.go:334] "Generic (PLEG): container finished" podID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerID="ce08003df15a2bbda00933e5b203c5706fd9ebf53f0eda8baf87866d74b94dc3" exitCode=0 Jan 05 14:10:55 crc kubenswrapper[4740]: I0105 14:10:55.455774 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6419-account-create-update-fgr2l" Jan 05 14:10:55 crc kubenswrapper[4740]: I0105 14:10:55.455761 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d1ae341-9055-4ead-9bbf-57e6489ef7b4","Type":"ContainerDied","Data":"ee40b3d9b9db62fd19a5cde86df79de30a32bbb0d2d4e6da391376d0f54e3221"} Jan 05 14:10:55 crc kubenswrapper[4740]: I0105 14:10:55.456220 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d1ae341-9055-4ead-9bbf-57e6489ef7b4","Type":"ContainerDied","Data":"6564dd5fb8b64558ba00ffd342d387b568b5218dc4d9b75b6c50bc0c36baac73"} Jan 05 14:10:55 crc kubenswrapper[4740]: I0105 14:10:55.456245 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d1ae341-9055-4ead-9bbf-57e6489ef7b4","Type":"ContainerDied","Data":"ce08003df15a2bbda00933e5b203c5706fd9ebf53f0eda8baf87866d74b94dc3"} Jan 05 14:10:57 crc kubenswrapper[4740]: I0105 14:10:57.496874 4740 generic.go:334] "Generic (PLEG): container finished" podID="231fcf86-a13a-4dce-b991-609c503de1ec" containerID="55e7651216719090d790e99574bd370d4cc43c269d7cae4fb327bf843c608baf" exitCode=0 Jan 05 14:10:57 crc kubenswrapper[4740]: I0105 14:10:57.496958 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lsnkl" event={"ID":"231fcf86-a13a-4dce-b991-609c503de1ec","Type":"ContainerDied","Data":"55e7651216719090d790e99574bd370d4cc43c269d7cae4fb327bf843c608baf"} Jan 05 14:10:58 crc kubenswrapper[4740]: I0105 14:10:58.510494 4740 generic.go:334] "Generic (PLEG): container finished" podID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerID="88b79803b998655fef119a7849d03534229196d076d8197af43083c7797d81ea" exitCode=0 Jan 05 14:10:58 crc kubenswrapper[4740]: I0105 14:10:58.510576 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d1ae341-9055-4ead-9bbf-57e6489ef7b4","Type":"ContainerDied","Data":"88b79803b998655fef119a7849d03534229196d076d8197af43083c7797d81ea"} Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.001867 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lsnkl" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.146453 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5whm9\" (UniqueName: \"kubernetes.io/projected/231fcf86-a13a-4dce-b991-609c503de1ec-kube-api-access-5whm9\") pod \"231fcf86-a13a-4dce-b991-609c503de1ec\" (UID: \"231fcf86-a13a-4dce-b991-609c503de1ec\") " Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.146565 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-scripts\") pod \"231fcf86-a13a-4dce-b991-609c503de1ec\" (UID: \"231fcf86-a13a-4dce-b991-609c503de1ec\") " Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.146628 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-config-data\") pod \"231fcf86-a13a-4dce-b991-609c503de1ec\" (UID: \"231fcf86-a13a-4dce-b991-609c503de1ec\") " Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.146735 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-combined-ca-bundle\") pod \"231fcf86-a13a-4dce-b991-609c503de1ec\" (UID: \"231fcf86-a13a-4dce-b991-609c503de1ec\") " Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.153876 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231fcf86-a13a-4dce-b991-609c503de1ec-kube-api-access-5whm9" (OuterVolumeSpecName: "kube-api-access-5whm9") pod "231fcf86-a13a-4dce-b991-609c503de1ec" (UID: "231fcf86-a13a-4dce-b991-609c503de1ec"). InnerVolumeSpecName "kube-api-access-5whm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.155234 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-scripts" (OuterVolumeSpecName: "scripts") pod "231fcf86-a13a-4dce-b991-609c503de1ec" (UID: "231fcf86-a13a-4dce-b991-609c503de1ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.191946 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "231fcf86-a13a-4dce-b991-609c503de1ec" (UID: "231fcf86-a13a-4dce-b991-609c503de1ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.196996 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-config-data" (OuterVolumeSpecName: "config-data") pod "231fcf86-a13a-4dce-b991-609c503de1ec" (UID: "231fcf86-a13a-4dce-b991-609c503de1ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.214678 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.253999 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.254048 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.254077 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231fcf86-a13a-4dce-b991-609c503de1ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.254091 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5whm9\" (UniqueName: \"kubernetes.io/projected/231fcf86-a13a-4dce-b991-609c503de1ec-kube-api-access-5whm9\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.355679 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-sg-core-conf-yaml\") pod \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.355744 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-log-httpd\") pod \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.355758 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-combined-ca-bundle\") pod \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.355836 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4jxg\" (UniqueName: \"kubernetes.io/projected/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-kube-api-access-r4jxg\") pod \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.355870 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-scripts\") pod \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.355959 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-run-httpd\") pod \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\" (UID: \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.356099 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-config-data\") pod \"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\" (UID: 
\"3d1ae341-9055-4ead-9bbf-57e6489ef7b4\") " Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.356995 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3d1ae341-9055-4ead-9bbf-57e6489ef7b4" (UID: "3d1ae341-9055-4ead-9bbf-57e6489ef7b4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.357013 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3d1ae341-9055-4ead-9bbf-57e6489ef7b4" (UID: "3d1ae341-9055-4ead-9bbf-57e6489ef7b4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.359959 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-kube-api-access-r4jxg" (OuterVolumeSpecName: "kube-api-access-r4jxg") pod "3d1ae341-9055-4ead-9bbf-57e6489ef7b4" (UID: "3d1ae341-9055-4ead-9bbf-57e6489ef7b4"). InnerVolumeSpecName "kube-api-access-r4jxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.364831 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-scripts" (OuterVolumeSpecName: "scripts") pod "3d1ae341-9055-4ead-9bbf-57e6489ef7b4" (UID: "3d1ae341-9055-4ead-9bbf-57e6489ef7b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.391333 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3d1ae341-9055-4ead-9bbf-57e6489ef7b4" (UID: "3d1ae341-9055-4ead-9bbf-57e6489ef7b4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.449251 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d1ae341-9055-4ead-9bbf-57e6489ef7b4" (UID: "3d1ae341-9055-4ead-9bbf-57e6489ef7b4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.458497 4740 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.458537 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.458553 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4jxg\" (UniqueName: \"kubernetes.io/projected/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-kube-api-access-r4jxg\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.458566 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.458577 4740 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.458751 4740 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.498162 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-config-data" (OuterVolumeSpecName: "config-data") pod "3d1ae341-9055-4ead-9bbf-57e6489ef7b4" (UID: "3d1ae341-9055-4ead-9bbf-57e6489ef7b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.524861 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d1ae341-9055-4ead-9bbf-57e6489ef7b4","Type":"ContainerDied","Data":"91449ab436188a302b38caa7ba571001a604c0e2a6578b10f1589e0ef93f5ff6"} Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.524894 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.524923 4740 scope.go:117] "RemoveContainer" containerID="ee40b3d9b9db62fd19a5cde86df79de30a32bbb0d2d4e6da391376d0f54e3221" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.536141 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lsnkl" event={"ID":"231fcf86-a13a-4dce-b991-609c503de1ec","Type":"ContainerDied","Data":"6ecc2aab20fe7a76f04a2372b863aa1da5f041e5687ba53faec4bf1b7e32ef79"} Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.536208 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ecc2aab20fe7a76f04a2372b863aa1da5f041e5687ba53faec4bf1b7e32ef79" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.536300 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lsnkl" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.560740 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d1ae341-9055-4ead-9bbf-57e6489ef7b4-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.594237 4740 scope.go:117] "RemoveContainer" containerID="6564dd5fb8b64558ba00ffd342d387b568b5218dc4d9b75b6c50bc0c36baac73" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.625841 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.637784 4740 scope.go:117] "RemoveContainer" containerID="ce08003df15a2bbda00933e5b203c5706fd9ebf53f0eda8baf87866d74b94dc3" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.665393 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.683431 4740 scope.go:117] "RemoveContainer" containerID="88b79803b998655fef119a7849d03534229196d076d8197af43083c7797d81ea" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.683432 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 05 14:10:59 crc kubenswrapper[4740]: E0105 14:10:59.684236 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerName="proxy-httpd" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.684259 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerName="proxy-httpd" Jan 05 14:10:59 crc kubenswrapper[4740]: E0105 14:10:59.684296 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerName="ceilometer-central-agent" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.684303 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerName="ceilometer-central-agent" Jan 05 14:10:59 crc kubenswrapper[4740]: E0105 14:10:59.684312 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerName="ceilometer-notification-agent" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.684320 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerName="ceilometer-notification-agent" Jan 05 14:10:59 crc kubenswrapper[4740]: E0105 14:10:59.684378 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231fcf86-a13a-4dce-b991-609c503de1ec" containerName="nova-cell0-conductor-db-sync" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.684388 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="231fcf86-a13a-4dce-b991-609c503de1ec" containerName="nova-cell0-conductor-db-sync" Jan 05 14:10:59 crc kubenswrapper[4740]: E0105 14:10:59.684408 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerName="sg-core" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.684415 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerName="sg-core" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.684668 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerName="proxy-httpd" 
Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.684683 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="231fcf86-a13a-4dce-b991-609c503de1ec" containerName="nova-cell0-conductor-db-sync" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.684700 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerName="ceilometer-notification-agent" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.684722 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerName="sg-core" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.684741 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" containerName="ceilometer-central-agent" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.685770 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.691853 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-drccb" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.692115 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.697952 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.715430 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.717758 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.718280 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.736409 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.747467 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:10:59 crc kubenswrapper[4740]: E0105 14:10:59.770416 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod231fcf86_a13a_4dce_b991_609c503de1ec.slice\": RecentStats: unable to find data in memory cache]" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.871619 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-log-httpd\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.871861 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67kz5\" (UniqueName: \"kubernetes.io/projected/602b77b2-3d49-4b22-bb8c-6bcb33894264-kube-api-access-67kz5\") pod \"nova-cell0-conductor-0\" (UID: \"602b77b2-3d49-4b22-bb8c-6bcb33894264\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.871972 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-run-httpd\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.872121 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrdpk\" (UniqueName: \"kubernetes.io/projected/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-kube-api-access-qrdpk\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.872207 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-config-data\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.872458 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-scripts\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.872506 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602b77b2-3d49-4b22-bb8c-6bcb33894264-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"602b77b2-3d49-4b22-bb8c-6bcb33894264\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.872704 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.872739 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.872760 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602b77b2-3d49-4b22-bb8c-6bcb33894264-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"602b77b2-3d49-4b22-bb8c-6bcb33894264\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.974361 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-log-httpd\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.974428 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67kz5\" (UniqueName: \"kubernetes.io/projected/602b77b2-3d49-4b22-bb8c-6bcb33894264-kube-api-access-67kz5\") 
pod \"nova-cell0-conductor-0\" (UID: \"602b77b2-3d49-4b22-bb8c-6bcb33894264\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.974491 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-run-httpd\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.974586 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrdpk\" (UniqueName: \"kubernetes.io/projected/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-kube-api-access-qrdpk\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.974621 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-config-data\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.974693 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-scripts\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.974720 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602b77b2-3d49-4b22-bb8c-6bcb33894264-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"602b77b2-3d49-4b22-bb8c-6bcb33894264\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.974788 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.974814 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.974836 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602b77b2-3d49-4b22-bb8c-6bcb33894264-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"602b77b2-3d49-4b22-bb8c-6bcb33894264\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.976368 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-log-httpd\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.976819 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-run-httpd\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.978744 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602b77b2-3d49-4b22-bb8c-6bcb33894264-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"602b77b2-3d49-4b22-bb8c-6bcb33894264\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.981855 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.982252 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.983540 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602b77b2-3d49-4b22-bb8c-6bcb33894264-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"602b77b2-3d49-4b22-bb8c-6bcb33894264\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.984160 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-scripts\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.984325 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-config-data\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.995127 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67kz5\" (UniqueName: \"kubernetes.io/projected/602b77b2-3d49-4b22-bb8c-6bcb33894264-kube-api-access-67kz5\") pod \"nova-cell0-conductor-0\" (UID: \"602b77b2-3d49-4b22-bb8c-6bcb33894264\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:10:59 crc kubenswrapper[4740]: I0105 14:10:59.998317 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrdpk\" (UniqueName: \"kubernetes.io/projected/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-kube-api-access-qrdpk\") pod \"ceilometer-0\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " pod="openstack/ceilometer-0" Jan 05 14:11:00 crc kubenswrapper[4740]: I0105 14:11:00.012417 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:00 crc kubenswrapper[4740]: I0105 14:11:00.037435 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:11:00 crc kubenswrapper[4740]: I0105 14:11:00.522849 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 05 14:11:00 crc kubenswrapper[4740]: I0105 14:11:00.548492 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"602b77b2-3d49-4b22-bb8c-6bcb33894264","Type":"ContainerStarted","Data":"f3cc2a285eee28588f055b48a364f42995d79a0d4cb8fa87beed1bf37c188320"} Jan 05 14:11:00 crc kubenswrapper[4740]: W0105 14:11:00.654160 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dc1e0c7_eaaf_49c8_aa2c_88fe788bf6fa.slice/crio-532599b2f2d53993e7fdbcca984ae7c43873eda101fd8aedf39995c2ca4a492f WatchSource:0}: Error finding container 532599b2f2d53993e7fdbcca984ae7c43873eda101fd8aedf39995c2ca4a492f: Status 404 returned error can't find the container with id 532599b2f2d53993e7fdbcca984ae7c43873eda101fd8aedf39995c2ca4a492f Jan 05 14:11:00 crc kubenswrapper[4740]: I0105 14:11:00.656467 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:11:00 crc kubenswrapper[4740]: I0105 14:11:00.992727 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d1ae341-9055-4ead-9bbf-57e6489ef7b4" path="/var/lib/kubelet/pods/3d1ae341-9055-4ead-9bbf-57e6489ef7b4/volumes" Jan 05 14:11:01 crc kubenswrapper[4740]: I0105 14:11:01.585281 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa","Type":"ContainerStarted","Data":"b65f6486e8d2516657f1aaf405dacfeb8061d83b3f1d395912f11d009d94aa1a"} Jan 05 14:11:01 crc kubenswrapper[4740]: I0105 14:11:01.586393 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa","Type":"ContainerStarted","Data":"532599b2f2d53993e7fdbcca984ae7c43873eda101fd8aedf39995c2ca4a492f"} Jan 05 14:11:01 crc kubenswrapper[4740]: I0105 14:11:01.601309 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"602b77b2-3d49-4b22-bb8c-6bcb33894264","Type":"ContainerStarted","Data":"b89ac7c4f41907a5412d534b6d46dca3dc002bcf08d6c0cd81bc1d9a05b9ef76"} Jan 05 14:11:01 crc kubenswrapper[4740]: I0105 14:11:01.601761 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:01 crc kubenswrapper[4740]: I0105 14:11:01.619418 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.619402274 podStartE2EDuration="2.619402274s" podCreationTimestamp="2026-01-05 14:10:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:11:01.613847135 +0000 UTC m=+1310.920755714" watchObservedRunningTime="2026-01-05 14:11:01.619402274 +0000 UTC m=+1310.926310843" Jan 05 14:11:02 crc kubenswrapper[4740]: I0105 14:11:02.644757 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa","Type":"ContainerStarted","Data":"bd55d3fd52610b600f63b12b10d3999d91fd7d0f99f99b64d5b0a623f07cc1c0"} Jan 05 14:11:03 crc kubenswrapper[4740]: I0105 14:11:03.656499 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa","Type":"ContainerStarted","Data":"3932fe7a0a33999c636ebb58db8e52413e22e6a0c754b156c39d4dca6619b1f7"} Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.045201 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.615037 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-g9xk2"] Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.620306 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-g9xk2" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.626307 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.626549 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.630172 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-g9xk2"] Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.692371 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa","Type":"ContainerStarted","Data":"f1b57a19da1eb0b94f75cdc9a17d877d25592b519b508c942988e8c9589f7705"} Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.692841 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.721447 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.653083389 podStartE2EDuration="6.721430946s" podCreationTimestamp="2026-01-05 14:10:59 +0000 UTC" firstStartedPulling="2026-01-05 14:11:00.656828662 +0000 UTC m=+1309.963737241" lastFinishedPulling="2026-01-05 14:11:04.725176219 +0000 UTC m=+1314.032084798" observedRunningTime="2026-01-05 14:11:05.71673092 +0000 UTC m=+1315.023639509" watchObservedRunningTime="2026-01-05 14:11:05.721430946 +0000 UTC m=+1315.028339525" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.722235 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-g9xk2\" (UID: \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\") " pod="openstack/nova-cell0-cell-mapping-g9xk2" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.722303 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-scripts\") pod \"nova-cell0-cell-mapping-g9xk2\" (UID: \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\") " pod="openstack/nova-cell0-cell-mapping-g9xk2" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.722334 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-config-data\") pod \"nova-cell0-cell-mapping-g9xk2\" (UID: \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\") " pod="openstack/nova-cell0-cell-mapping-g9xk2" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.722522 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7qrl\" (UniqueName: \"kubernetes.io/projected/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-kube-api-access-d7qrl\") pod \"nova-cell0-cell-mapping-g9xk2\" (UID: \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\") " pod="openstack/nova-cell0-cell-mapping-g9xk2" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.775619 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.777760 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.781613 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.803787 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.824243 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-g9xk2\" (UID: \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\") " pod="openstack/nova-cell0-cell-mapping-g9xk2" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.824286 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpszf\" (UniqueName: \"kubernetes.io/projected/86f37a8e-5506-4845-ae5b-a71374c00a5f-kube-api-access-wpszf\") pod \"nova-api-0\" (UID: \"86f37a8e-5506-4845-ae5b-a71374c00a5f\") " pod="openstack/nova-api-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.824331 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-scripts\") pod \"nova-cell0-cell-mapping-g9xk2\" (UID: \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\") " pod="openstack/nova-cell0-cell-mapping-g9xk2" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.824354 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f37a8e-5506-4845-ae5b-a71374c00a5f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"86f37a8e-5506-4845-ae5b-a71374c00a5f\") " pod="openstack/nova-api-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.824374 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-config-data\") pod \"nova-cell0-cell-mapping-g9xk2\" (UID: \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\") " pod="openstack/nova-cell0-cell-mapping-g9xk2" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.824479 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86f37a8e-5506-4845-ae5b-a71374c00a5f-logs\") pod \"nova-api-0\" (UID: \"86f37a8e-5506-4845-ae5b-a71374c00a5f\") " pod="openstack/nova-api-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.824644 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f37a8e-5506-4845-ae5b-a71374c00a5f-config-data\") pod \"nova-api-0\" (UID: 
\"86f37a8e-5506-4845-ae5b-a71374c00a5f\") " pod="openstack/nova-api-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.824840 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7qrl\" (UniqueName: \"kubernetes.io/projected/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-kube-api-access-d7qrl\") pod \"nova-cell0-cell-mapping-g9xk2\" (UID: \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\") " pod="openstack/nova-cell0-cell-mapping-g9xk2" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.844579 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-config-data\") pod \"nova-cell0-cell-mapping-g9xk2\" (UID: \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\") " pod="openstack/nova-cell0-cell-mapping-g9xk2" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.848445 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.851006 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.854139 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-g9xk2\" (UID: \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\") " pod="openstack/nova-cell0-cell-mapping-g9xk2" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.854574 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.854650 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7qrl\" (UniqueName: \"kubernetes.io/projected/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-kube-api-access-d7qrl\") pod \"nova-cell0-cell-mapping-g9xk2\" (UID: \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\") " pod="openstack/nova-cell0-cell-mapping-g9xk2" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.875687 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-scripts\") pod \"nova-cell0-cell-mapping-g9xk2\" (UID: \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\") " pod="openstack/nova-cell0-cell-mapping-g9xk2" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.892460 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.927476 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpszf\" (UniqueName: \"kubernetes.io/projected/86f37a8e-5506-4845-ae5b-a71374c00a5f-kube-api-access-wpszf\") pod \"nova-api-0\" (UID: \"86f37a8e-5506-4845-ae5b-a71374c00a5f\") " pod="openstack/nova-api-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.927538 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f37a8e-5506-4845-ae5b-a71374c00a5f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"86f37a8e-5506-4845-ae5b-a71374c00a5f\") " pod="openstack/nova-api-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.927589 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/86f37a8e-5506-4845-ae5b-a71374c00a5f-logs\") pod \"nova-api-0\" (UID: \"86f37a8e-5506-4845-ae5b-a71374c00a5f\") " pod="openstack/nova-api-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.927622 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de290cbf-7ff1-423b-b2d2-b7232b2165e4-logs\") pod \"nova-metadata-0\" (UID: \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\") " pod="openstack/nova-metadata-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.927639 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de290cbf-7ff1-423b-b2d2-b7232b2165e4-config-data\") pod \"nova-metadata-0\" (UID: \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\") " pod="openstack/nova-metadata-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.927673 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b79vk\" (UniqueName: \"kubernetes.io/projected/de290cbf-7ff1-423b-b2d2-b7232b2165e4-kube-api-access-b79vk\") pod \"nova-metadata-0\" (UID: \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\") " pod="openstack/nova-metadata-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.927705 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f37a8e-5506-4845-ae5b-a71374c00a5f-config-data\") pod \"nova-api-0\" (UID: \"86f37a8e-5506-4845-ae5b-a71374c00a5f\") " pod="openstack/nova-api-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.927722 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de290cbf-7ff1-423b-b2d2-b7232b2165e4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\") " pod="openstack/nova-metadata-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.932287 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86f37a8e-5506-4845-ae5b-a71374c00a5f-logs\") pod \"nova-api-0\" (UID: \"86f37a8e-5506-4845-ae5b-a71374c00a5f\") " pod="openstack/nova-api-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.938751 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-g9xk2" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.942693 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f37a8e-5506-4845-ae5b-a71374c00a5f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"86f37a8e-5506-4845-ae5b-a71374c00a5f\") " pod="openstack/nova-api-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.946301 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f37a8e-5506-4845-ae5b-a71374c00a5f-config-data\") pod \"nova-api-0\" (UID: \"86f37a8e-5506-4845-ae5b-a71374c00a5f\") " pod="openstack/nova-api-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.980540 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpszf\" (UniqueName: \"kubernetes.io/projected/86f37a8e-5506-4845-ae5b-a71374c00a5f-kube-api-access-wpszf\") pod \"nova-api-0\" (UID: \"86f37a8e-5506-4845-ae5b-a71374c00a5f\") " pod="openstack/nova-api-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.983824 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.985650 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 05 14:11:05 crc kubenswrapper[4740]: I0105 14:11:05.995259 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.032622 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62896e4a-182c-414b-a31f-0890587c3498-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"62896e4a-182c-414b-a31f-0890587c3498\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.032785 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de290cbf-7ff1-423b-b2d2-b7232b2165e4-logs\") pod \"nova-metadata-0\" (UID: \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\") " pod="openstack/nova-metadata-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.032813 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de290cbf-7ff1-423b-b2d2-b7232b2165e4-config-data\") pod \"nova-metadata-0\" (UID: \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\") " pod="openstack/nova-metadata-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.032864 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kww67\" (UniqueName: \"kubernetes.io/projected/62896e4a-182c-414b-a31f-0890587c3498-kube-api-access-kww67\") pod \"nova-scheduler-0\" (UID: \"62896e4a-182c-414b-a31f-0890587c3498\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.032883 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b79vk\" (UniqueName: \"kubernetes.io/projected/de290cbf-7ff1-423b-b2d2-b7232b2165e4-kube-api-access-b79vk\") pod \"nova-metadata-0\" (UID: \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\") " pod="openstack/nova-metadata-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.032915 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de290cbf-7ff1-423b-b2d2-b7232b2165e4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\") " pod="openstack/nova-metadata-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.033015 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62896e4a-182c-414b-a31f-0890587c3498-config-data\") pod \"nova-scheduler-0\" (UID: \"62896e4a-182c-414b-a31f-0890587c3498\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.035948 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de290cbf-7ff1-423b-b2d2-b7232b2165e4-logs\") pod \"nova-metadata-0\" (UID: \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\") " pod="openstack/nova-metadata-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.045767 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de290cbf-7ff1-423b-b2d2-b7232b2165e4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\") " pod="openstack/nova-metadata-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.062535 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b79vk\" (UniqueName: \"kubernetes.io/projected/de290cbf-7ff1-423b-b2d2-b7232b2165e4-kube-api-access-b79vk\") pod \"nova-metadata-0\" (UID: \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\") " pod="openstack/nova-metadata-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.067737 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de290cbf-7ff1-423b-b2d2-b7232b2165e4-config-data\") pod \"nova-metadata-0\" (UID: \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\") " pod="openstack/nova-metadata-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.080142 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-jqjqm"] Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.082234 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.097999 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.120654 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.134563 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.136757 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-config\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.136794 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.136837 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62896e4a-182c-414b-a31f-0890587c3498-config-data\") pod \"nova-scheduler-0\" (UID: \"62896e4a-182c-414b-a31f-0890587c3498\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.136903 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62896e4a-182c-414b-a31f-0890587c3498-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"62896e4a-182c-414b-a31f-0890587c3498\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.136921 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl6kl\" (UniqueName: \"kubernetes.io/projected/69df31f0-004d-41d9-8c4c-c0bc865ff354-kube-api-access-hl6kl\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.136960 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.136987 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.137079 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.137117 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kww67\" (UniqueName: \"kubernetes.io/projected/62896e4a-182c-414b-a31f-0890587c3498-kube-api-access-kww67\") pod \"nova-scheduler-0\" (UID: 
\"62896e4a-182c-414b-a31f-0890587c3498\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.141824 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62896e4a-182c-414b-a31f-0890587c3498-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"62896e4a-182c-414b-a31f-0890587c3498\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.151495 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-jqjqm"] Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.161427 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kww67\" (UniqueName: \"kubernetes.io/projected/62896e4a-182c-414b-a31f-0890587c3498-kube-api-access-kww67\") pod \"nova-scheduler-0\" (UID: \"62896e4a-182c-414b-a31f-0890587c3498\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.179676 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62896e4a-182c-414b-a31f-0890587c3498-config-data\") pod \"nova-scheduler-0\" (UID: \"62896e4a-182c-414b-a31f-0890587c3498\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.239234 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.243382 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.243625 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-config\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.243658 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.243864 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl6kl\" (UniqueName: \"kubernetes.io/projected/69df31f0-004d-41d9-8c4c-c0bc865ff354-kube-api-access-hl6kl\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.244869 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.244937 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.244988 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.246333 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.247962 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.249639 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.250293 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.250912 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-config\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.251679 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.271468 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl6kl\" (UniqueName: \"kubernetes.io/projected/69df31f0-004d-41d9-8c4c-c0bc865ff354-kube-api-access-hl6kl\") pod \"dnsmasq-dns-568d7fd7cf-jqjqm\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.275582 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.348470 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmq5x\" (UniqueName: \"kubernetes.io/projected/34b94c66-6d17-44d2-acb7-c9033986fedb-kube-api-access-qmq5x\") pod \"nova-cell1-novncproxy-0\" (UID: \"34b94c66-6d17-44d2-acb7-c9033986fedb\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.349797 
4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b94c66-6d17-44d2-acb7-c9033986fedb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"34b94c66-6d17-44d2-acb7-c9033986fedb\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.351636 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b94c66-6d17-44d2-acb7-c9033986fedb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"34b94c66-6d17-44d2-acb7-c9033986fedb\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.458600 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmq5x\" (UniqueName: \"kubernetes.io/projected/34b94c66-6d17-44d2-acb7-c9033986fedb-kube-api-access-qmq5x\") pod \"nova-cell1-novncproxy-0\" (UID: \"34b94c66-6d17-44d2-acb7-c9033986fedb\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.458982 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b94c66-6d17-44d2-acb7-c9033986fedb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"34b94c66-6d17-44d2-acb7-c9033986fedb\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.459333 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b94c66-6d17-44d2-acb7-c9033986fedb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"34b94c66-6d17-44d2-acb7-c9033986fedb\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.459798 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.474801 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b94c66-6d17-44d2-acb7-c9033986fedb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"34b94c66-6d17-44d2-acb7-c9033986fedb\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.475237 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b94c66-6d17-44d2-acb7-c9033986fedb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"34b94c66-6d17-44d2-acb7-c9033986fedb\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.488644 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmq5x\" (UniqueName: \"kubernetes.io/projected/34b94c66-6d17-44d2-acb7-c9033986fedb-kube-api-access-qmq5x\") pod \"nova-cell1-novncproxy-0\" (UID: \"34b94c66-6d17-44d2-acb7-c9033986fedb\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.499543 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.581612 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.608709 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-g9xk2"] Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.719023 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-g9xk2" event={"ID":"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb","Type":"ContainerStarted","Data":"ffe2d450a85a81ea9c5cd3dc4b50eb372af920815808acaedbd629bcec750e21"} Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.902501 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 05 14:11:06 crc kubenswrapper[4740]: I0105 14:11:06.929536 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 05 14:11:07 crc kubenswrapper[4740]: I0105 14:11:07.397968 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-jqjqm"] Jan 05 14:11:07 crc kubenswrapper[4740]: I0105 14:11:07.467930 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 05 14:11:07 crc kubenswrapper[4740]: I0105 14:11:07.490657 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 05 14:11:07 crc kubenswrapper[4740]: I0105 14:11:07.730905 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"34b94c66-6d17-44d2-acb7-c9033986fedb","Type":"ContainerStarted","Data":"a70580802567064d454a9e78d85064c2e779396b34a0ab621b2c54e6cb9c6c90"} Jan 05 14:11:07 crc kubenswrapper[4740]: I0105 14:11:07.735506 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"de290cbf-7ff1-423b-b2d2-b7232b2165e4","Type":"ContainerStarted","Data":"b91d5055f69aa67f4b97509ebe2f77d45cea9a1f02f7584391fb713344b3fae8"} Jan 05 14:11:07 crc kubenswrapper[4740]: I0105 14:11:07.738248 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" event={"ID":"69df31f0-004d-41d9-8c4c-c0bc865ff354","Type":"ContainerStarted","Data":"ef314bcd6be6683cb1c18c4f6c2e8d69e561b18350d2d2c4626c9e7dbef901d7"} Jan 05 14:11:07 crc kubenswrapper[4740]: I0105 14:11:07.739749 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-g9xk2" event={"ID":"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb","Type":"ContainerStarted","Data":"a422b28e96fb5cde02201a4137a02f228c697a541da763888ce490fd33b1b8fb"} Jan 05 14:11:07 crc kubenswrapper[4740]: I0105 14:11:07.740767 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"62896e4a-182c-414b-a31f-0890587c3498","Type":"ContainerStarted","Data":"81f43bed880c54a2e3fa7d58850402e33c9299d88fac6403e7d1c6c3cbe64a53"} Jan 05 14:11:07 crc kubenswrapper[4740]: I0105 14:11:07.752178 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"86f37a8e-5506-4845-ae5b-a71374c00a5f","Type":"ContainerStarted","Data":"d0e05826b308f443f6bff3fc2203123939b8bd1e697c4d781d697448b636b24e"} Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.337423 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-g9xk2" podStartSLOduration=3.33740429 podStartE2EDuration="3.33740429s" podCreationTimestamp="2026-01-05 14:11:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-01-05 14:11:07.764290494 +0000 UTC m=+1317.071199073" watchObservedRunningTime="2026-01-05 14:11:08.33740429 +0000 UTC m=+1317.644312869" Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.349479 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-25blk"] Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.351043 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-25blk" Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.356815 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.356991 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.372303 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-25blk"] Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.536224 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-scripts\") pod \"nova-cell1-conductor-db-sync-25blk\" (UID: \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\") " pod="openstack/nova-cell1-conductor-db-sync-25blk" Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.536365 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-25blk\" (UID: \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\") " pod="openstack/nova-cell1-conductor-db-sync-25blk" Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.536436 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r78xf\" (UniqueName: \"kubernetes.io/projected/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-kube-api-access-r78xf\") pod \"nova-cell1-conductor-db-sync-25blk\" (UID: \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\") " pod="openstack/nova-cell1-conductor-db-sync-25blk" Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.536486 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-config-data\") pod \"nova-cell1-conductor-db-sync-25blk\" (UID: \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\") " pod="openstack/nova-cell1-conductor-db-sync-25blk" Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.638668 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-config-data\") pod \"nova-cell1-conductor-db-sync-25blk\" (UID: \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\") " pod="openstack/nova-cell1-conductor-db-sync-25blk" Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.638818 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-scripts\") pod \"nova-cell1-conductor-db-sync-25blk\" (UID: \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\") " pod="openstack/nova-cell1-conductor-db-sync-25blk" Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 
14:11:08.638898 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-25blk\" (UID: \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\") " pod="openstack/nova-cell1-conductor-db-sync-25blk" Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.638961 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r78xf\" (UniqueName: \"kubernetes.io/projected/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-kube-api-access-r78xf\") pod \"nova-cell1-conductor-db-sync-25blk\" (UID: \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\") " pod="openstack/nova-cell1-conductor-db-sync-25blk" Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.647657 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-scripts\") pod \"nova-cell1-conductor-db-sync-25blk\" (UID: \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\") " pod="openstack/nova-cell1-conductor-db-sync-25blk" Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.657203 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-config-data\") pod \"nova-cell1-conductor-db-sync-25blk\" (UID: \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\") " pod="openstack/nova-cell1-conductor-db-sync-25blk" Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.661197 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-25blk\" (UID: \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\") " pod="openstack/nova-cell1-conductor-db-sync-25blk" Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.680120 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r78xf\" (UniqueName: \"kubernetes.io/projected/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-kube-api-access-r78xf\") pod \"nova-cell1-conductor-db-sync-25blk\" (UID: \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\") " pod="openstack/nova-cell1-conductor-db-sync-25blk" Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.779993 4740 generic.go:334] "Generic (PLEG): container finished" podID="69df31f0-004d-41d9-8c4c-c0bc865ff354" containerID="b81f88f084252f3f4c848a191096c8570b19f56426c0c005faa9322e001569fc" exitCode=0 Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.780124 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" event={"ID":"69df31f0-004d-41d9-8c4c-c0bc865ff354","Type":"ContainerDied","Data":"b81f88f084252f3f4c848a191096c8570b19f56426c0c005faa9322e001569fc"} Jan 05 14:11:08 crc kubenswrapper[4740]: I0105 14:11:08.972803 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-25blk" Jan 05 14:11:09 crc kubenswrapper[4740]: I0105 14:11:09.644919 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 05 14:11:09 crc kubenswrapper[4740]: I0105 14:11:09.680123 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 05 14:11:09 crc kubenswrapper[4740]: I0105 14:11:09.691926 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-25blk"] Jan 05 14:11:09 crc kubenswrapper[4740]: I0105 14:11:09.791054 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" event={"ID":"69df31f0-004d-41d9-8c4c-c0bc865ff354","Type":"ContainerStarted","Data":"b26e21512e236401b079a1d8112670b3253bcf4216720cc81e8ff27ebe6cb1de"} Jan 05 14:11:09 crc kubenswrapper[4740]: I0105 14:11:09.791238 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:09 crc kubenswrapper[4740]: I0105 14:11:09.811492 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" podStartSLOduration=4.811475466 podStartE2EDuration="4.811475466s" podCreationTimestamp="2026-01-05 14:11:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:11:09.808361662 +0000 UTC m=+1319.115270241" watchObservedRunningTime="2026-01-05 14:11:09.811475466 +0000 UTC m=+1319.118384045" Jan 05 14:11:10 crc kubenswrapper[4740]: I0105 14:11:10.537697 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 05 14:11:10 crc kubenswrapper[4740]: I0105 14:11:10.538110 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="602b77b2-3d49-4b22-bb8c-6bcb33894264" containerName="nova-cell0-conductor-conductor" containerID="cri-o://b89ac7c4f41907a5412d534b6d46dca3dc002bcf08d6c0cd81bc1d9a05b9ef76" gracePeriod=30 Jan 05 14:11:10 crc kubenswrapper[4740]: I0105 14:11:10.551022 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 05 14:11:10 crc kubenswrapper[4740]: I0105 14:11:10.567174 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 05 14:11:10 crc kubenswrapper[4740]: I0105 14:11:10.761142 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:11:10 crc kubenswrapper[4740]: I0105 14:11:10.761410 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerName="ceilometer-central-agent" containerID="cri-o://b65f6486e8d2516657f1aaf405dacfeb8061d83b3f1d395912f11d009d94aa1a" gracePeriod=30 Jan 05 14:11:10 crc kubenswrapper[4740]: I0105 14:11:10.761478 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerName="sg-core" containerID="cri-o://3932fe7a0a33999c636ebb58db8e52413e22e6a0c754b156c39d4dca6619b1f7" gracePeriod=30 Jan 05 14:11:10 crc kubenswrapper[4740]: I0105 14:11:10.761525 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerName="ceilometer-notification-agent" 
containerID="cri-o://bd55d3fd52610b600f63b12b10d3999d91fd7d0f99f99b64d5b0a623f07cc1c0" gracePeriod=30 Jan 05 14:11:10 crc kubenswrapper[4740]: I0105 14:11:10.761623 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerName="proxy-httpd" containerID="cri-o://f1b57a19da1eb0b94f75cdc9a17d877d25592b519b508c942988e8c9589f7705" gracePeriod=30 Jan 05 14:11:11 crc kubenswrapper[4740]: I0105 14:11:11.815154 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-25blk" event={"ID":"6eb2618b-5bb1-4caf-a6f9-708cddcbf390","Type":"ContainerStarted","Data":"0900e907a9e5df79c3b1e4ffdbd4352986ef809763a88ca1ee8fb5ffd6fe9d33"} Jan 05 14:11:11 crc kubenswrapper[4740]: I0105 14:11:11.818176 4740 generic.go:334] "Generic (PLEG): container finished" podID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerID="f1b57a19da1eb0b94f75cdc9a17d877d25592b519b508c942988e8c9589f7705" exitCode=0 Jan 05 14:11:11 crc kubenswrapper[4740]: I0105 14:11:11.818207 4740 generic.go:334] "Generic (PLEG): container finished" podID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerID="3932fe7a0a33999c636ebb58db8e52413e22e6a0c754b156c39d4dca6619b1f7" exitCode=2 Jan 05 14:11:11 crc kubenswrapper[4740]: I0105 14:11:11.818218 4740 generic.go:334] "Generic (PLEG): container finished" podID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerID="bd55d3fd52610b600f63b12b10d3999d91fd7d0f99f99b64d5b0a623f07cc1c0" exitCode=0 Jan 05 14:11:11 crc kubenswrapper[4740]: I0105 14:11:11.818237 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa","Type":"ContainerDied","Data":"f1b57a19da1eb0b94f75cdc9a17d877d25592b519b508c942988e8c9589f7705"} Jan 05 14:11:11 crc kubenswrapper[4740]: I0105 14:11:11.818262 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa","Type":"ContainerDied","Data":"3932fe7a0a33999c636ebb58db8e52413e22e6a0c754b156c39d4dca6619b1f7"} Jan 05 14:11:11 crc kubenswrapper[4740]: I0105 14:11:11.818271 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa","Type":"ContainerDied","Data":"bd55d3fd52610b600f63b12b10d3999d91fd7d0f99f99b64d5b0a623f07cc1c0"} Jan 05 14:11:12 crc kubenswrapper[4740]: I0105 14:11:12.839277 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"34b94c66-6d17-44d2-acb7-c9033986fedb","Type":"ContainerStarted","Data":"da8722ae38c11d4d22d5248a4269653e2ca7438958f74516b2bdbc138e16d8fc"} Jan 05 14:11:12 crc kubenswrapper[4740]: I0105 14:11:12.839903 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="34b94c66-6d17-44d2-acb7-c9033986fedb" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://da8722ae38c11d4d22d5248a4269653e2ca7438958f74516b2bdbc138e16d8fc" gracePeriod=30 Jan 05 14:11:12 crc kubenswrapper[4740]: I0105 14:11:12.843489 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"de290cbf-7ff1-423b-b2d2-b7232b2165e4","Type":"ContainerStarted","Data":"38f7c50b383da5e18d7341c2f5251104f8f5651cc48471a168f96b2790d31c89"} Jan 05 14:11:12 crc kubenswrapper[4740]: I0105 14:11:12.843521 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"de290cbf-7ff1-423b-b2d2-b7232b2165e4","Type":"ContainerStarted","Data":"2e8dfccdd1db04086c1f5be47bb90626a39bab2ecdcadd79c97b60e17c7e9d0c"} Jan 05 14:11:12 crc kubenswrapper[4740]: I0105 14:11:12.843637 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="de290cbf-7ff1-423b-b2d2-b7232b2165e4" containerName="nova-metadata-metadata" containerID="cri-o://38f7c50b383da5e18d7341c2f5251104f8f5651cc48471a168f96b2790d31c89" gracePeriod=30 Jan 05 14:11:12 crc kubenswrapper[4740]: I0105 14:11:12.843882 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="de290cbf-7ff1-423b-b2d2-b7232b2165e4" containerName="nova-metadata-log" containerID="cri-o://2e8dfccdd1db04086c1f5be47bb90626a39bab2ecdcadd79c97b60e17c7e9d0c" gracePeriod=30 Jan 05 14:11:12 crc kubenswrapper[4740]: I0105 14:11:12.847873 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-25blk" event={"ID":"6eb2618b-5bb1-4caf-a6f9-708cddcbf390","Type":"ContainerStarted","Data":"83a35e5626bd0af0822fc7d563c2a07eae40ef5243d0cecbe9a448d003703d5a"} Jan 05 14:11:12 crc kubenswrapper[4740]: I0105 14:11:12.850334 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="62896e4a-182c-414b-a31f-0890587c3498" containerName="nova-scheduler-scheduler" containerID="cri-o://d95b3e320a1f06d72f5929bd0ae0d85f655d519b1c91c50cd6160b05bced811f" gracePeriod=30 Jan 05 14:11:12 crc kubenswrapper[4740]: I0105 14:11:12.850435 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"62896e4a-182c-414b-a31f-0890587c3498","Type":"ContainerStarted","Data":"d95b3e320a1f06d72f5929bd0ae0d85f655d519b1c91c50cd6160b05bced811f"} Jan 05 14:11:12 crc kubenswrapper[4740]: I0105 14:11:12.860168 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"86f37a8e-5506-4845-ae5b-a71374c00a5f","Type":"ContainerStarted","Data":"2de6ea5db28ce1799e9e0aaa0cd943eae5c6a962aad50c42d163f9ccf6a23954"} Jan 05 14:11:12 crc kubenswrapper[4740]: I0105 14:11:12.860321 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="86f37a8e-5506-4845-ae5b-a71374c00a5f" containerName="nova-api-log" containerID="cri-o://2de6ea5db28ce1799e9e0aaa0cd943eae5c6a962aad50c42d163f9ccf6a23954" gracePeriod=30 Jan 05 14:11:12 crc kubenswrapper[4740]: I0105 14:11:12.860664 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="86f37a8e-5506-4845-ae5b-a71374c00a5f" containerName="nova-api-api" containerID="cri-o://bc865f45ae6bf992b6a2809d4faed867c29ce884f712346dfc72c74f9a7c4334" gracePeriod=30 Jan 05 14:11:12 crc kubenswrapper[4740]: I0105 14:11:12.864726 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.189420646 podStartE2EDuration="7.864706119s" podCreationTimestamp="2026-01-05 14:11:05 +0000 UTC" firstStartedPulling="2026-01-05 14:11:07.472444387 +0000 UTC m=+1316.779352966" lastFinishedPulling="2026-01-05 14:11:12.14772986 +0000 UTC m=+1321.454638439" observedRunningTime="2026-01-05 14:11:12.852827761 +0000 UTC m=+1322.159736350" watchObservedRunningTime="2026-01-05 14:11:12.864706119 +0000 UTC m=+1322.171614698" Jan 05 14:11:12 crc kubenswrapper[4740]: I0105 14:11:12.875940 4740 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.674498243 podStartE2EDuration="7.875922781s" podCreationTimestamp="2026-01-05 14:11:05 +0000 UTC" firstStartedPulling="2026-01-05 14:11:06.946163318 +0000 UTC m=+1316.253071897" lastFinishedPulling="2026-01-05 14:11:12.147587856 +0000 UTC m=+1321.454496435" observedRunningTime="2026-01-05 14:11:12.871809971 +0000 UTC m=+1322.178718550" watchObservedRunningTime="2026-01-05 14:11:12.875922781 +0000 UTC m=+1322.182831350" Jan 05 14:11:12 crc kubenswrapper[4740]: I0105 14:11:12.893641 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-25blk" podStartSLOduration=4.893624887 podStartE2EDuration="4.893624887s" podCreationTimestamp="2026-01-05 14:11:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:11:12.890080001 +0000 UTC m=+1322.196988580" watchObservedRunningTime="2026-01-05 14:11:12.893624887 +0000 UTC m=+1322.200533466" Jan 05 14:11:12 crc kubenswrapper[4740]: I0105 14:11:12.920297 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.185082531 podStartE2EDuration="7.920276022s" podCreationTimestamp="2026-01-05 14:11:05 +0000 UTC" firstStartedPulling="2026-01-05 14:11:07.453448188 +0000 UTC m=+1316.760356767" lastFinishedPulling="2026-01-05 14:11:12.188641679 +0000 UTC m=+1321.495550258" observedRunningTime="2026-01-05 14:11:12.913665054 +0000 UTC m=+1322.220573633" watchObservedRunningTime="2026-01-05 14:11:12.920276022 +0000 UTC m=+1322.227184601" Jan 05 14:11:12 crc kubenswrapper[4740]: I0105 14:11:12.941904 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.70570577 podStartE2EDuration="7.941888922s" podCreationTimestamp="2026-01-05 14:11:05 +0000 UTC" firstStartedPulling="2026-01-05 14:11:06.923847139 +0000 UTC m=+1316.230755718" lastFinishedPulling="2026-01-05 14:11:12.160030261 +0000 UTC m=+1321.466938870" observedRunningTime="2026-01-05 14:11:12.934978216 +0000 UTC m=+1322.241886795" watchObservedRunningTime="2026-01-05 14:11:12.941888922 +0000 UTC m=+1322.248797501" Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.758662 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.771457 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602b77b2-3d49-4b22-bb8c-6bcb33894264-combined-ca-bundle\") pod \"602b77b2-3d49-4b22-bb8c-6bcb33894264\" (UID: \"602b77b2-3d49-4b22-bb8c-6bcb33894264\") " Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.771668 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67kz5\" (UniqueName: \"kubernetes.io/projected/602b77b2-3d49-4b22-bb8c-6bcb33894264-kube-api-access-67kz5\") pod \"602b77b2-3d49-4b22-bb8c-6bcb33894264\" (UID: \"602b77b2-3d49-4b22-bb8c-6bcb33894264\") " Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.771766 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602b77b2-3d49-4b22-bb8c-6bcb33894264-config-data\") pod \"602b77b2-3d49-4b22-bb8c-6bcb33894264\" (UID: \"602b77b2-3d49-4b22-bb8c-6bcb33894264\") " Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.804977 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/602b77b2-3d49-4b22-bb8c-6bcb33894264-kube-api-access-67kz5" (OuterVolumeSpecName: "kube-api-access-67kz5") pod "602b77b2-3d49-4b22-bb8c-6bcb33894264" (UID: "602b77b2-3d49-4b22-bb8c-6bcb33894264"). InnerVolumeSpecName "kube-api-access-67kz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.835072 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/602b77b2-3d49-4b22-bb8c-6bcb33894264-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "602b77b2-3d49-4b22-bb8c-6bcb33894264" (UID: "602b77b2-3d49-4b22-bb8c-6bcb33894264"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.861258 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/602b77b2-3d49-4b22-bb8c-6bcb33894264-config-data" (OuterVolumeSpecName: "config-data") pod "602b77b2-3d49-4b22-bb8c-6bcb33894264" (UID: "602b77b2-3d49-4b22-bb8c-6bcb33894264"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.875305 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602b77b2-3d49-4b22-bb8c-6bcb33894264-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.875340 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67kz5\" (UniqueName: \"kubernetes.io/projected/602b77b2-3d49-4b22-bb8c-6bcb33894264-kube-api-access-67kz5\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.875355 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602b77b2-3d49-4b22-bb8c-6bcb33894264-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.876099 4740 generic.go:334] "Generic (PLEG): container finished" podID="86f37a8e-5506-4845-ae5b-a71374c00a5f" containerID="2de6ea5db28ce1799e9e0aaa0cd943eae5c6a962aad50c42d163f9ccf6a23954" exitCode=143 Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.876158 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"86f37a8e-5506-4845-ae5b-a71374c00a5f","Type":"ContainerStarted","Data":"bc865f45ae6bf992b6a2809d4faed867c29ce884f712346dfc72c74f9a7c4334"} Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.876183 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"86f37a8e-5506-4845-ae5b-a71374c00a5f","Type":"ContainerDied","Data":"2de6ea5db28ce1799e9e0aaa0cd943eae5c6a962aad50c42d163f9ccf6a23954"} Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.879327 4740 generic.go:334] "Generic (PLEG): container finished" podID="602b77b2-3d49-4b22-bb8c-6bcb33894264" containerID="b89ac7c4f41907a5412d534b6d46dca3dc002bcf08d6c0cd81bc1d9a05b9ef76" exitCode=0 Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.879410 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"602b77b2-3d49-4b22-bb8c-6bcb33894264","Type":"ContainerDied","Data":"b89ac7c4f41907a5412d534b6d46dca3dc002bcf08d6c0cd81bc1d9a05b9ef76"} Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.879436 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"602b77b2-3d49-4b22-bb8c-6bcb33894264","Type":"ContainerDied","Data":"f3cc2a285eee28588f055b48a364f42995d79a0d4cb8fa87beed1bf37c188320"} Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.879442 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.879452 4740 scope.go:117] "RemoveContainer" containerID="b89ac7c4f41907a5412d534b6d46dca3dc002bcf08d6c0cd81bc1d9a05b9ef76" Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.884356 4740 generic.go:334] "Generic (PLEG): container finished" podID="de290cbf-7ff1-423b-b2d2-b7232b2165e4" containerID="2e8dfccdd1db04086c1f5be47bb90626a39bab2ecdcadd79c97b60e17c7e9d0c" exitCode=143 Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.884391 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"de290cbf-7ff1-423b-b2d2-b7232b2165e4","Type":"ContainerDied","Data":"2e8dfccdd1db04086c1f5be47bb90626a39bab2ecdcadd79c97b60e17c7e9d0c"} Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.920290 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.920465 4740 scope.go:117] "RemoveContainer" containerID="b89ac7c4f41907a5412d534b6d46dca3dc002bcf08d6c0cd81bc1d9a05b9ef76" Jan 05 14:11:13 crc kubenswrapper[4740]: E0105 14:11:13.922176 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b89ac7c4f41907a5412d534b6d46dca3dc002bcf08d6c0cd81bc1d9a05b9ef76\": container with ID starting with b89ac7c4f41907a5412d534b6d46dca3dc002bcf08d6c0cd81bc1d9a05b9ef76 not found: ID does not exist" containerID="b89ac7c4f41907a5412d534b6d46dca3dc002bcf08d6c0cd81bc1d9a05b9ef76" Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.922217 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b89ac7c4f41907a5412d534b6d46dca3dc002bcf08d6c0cd81bc1d9a05b9ef76"} err="failed to get container status \"b89ac7c4f41907a5412d534b6d46dca3dc002bcf08d6c0cd81bc1d9a05b9ef76\": rpc error: code = NotFound desc = could not find container \"b89ac7c4f41907a5412d534b6d46dca3dc002bcf08d6c0cd81bc1d9a05b9ef76\": container with ID starting with b89ac7c4f41907a5412d534b6d46dca3dc002bcf08d6c0cd81bc1d9a05b9ef76 not found: ID does not exist" Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.942592 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.954290 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 05 14:11:13 crc kubenswrapper[4740]: E0105 14:11:13.954822 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="602b77b2-3d49-4b22-bb8c-6bcb33894264" containerName="nova-cell0-conductor-conductor" Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.954839 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="602b77b2-3d49-4b22-bb8c-6bcb33894264" containerName="nova-cell0-conductor-conductor" Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.955204 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="602b77b2-3d49-4b22-bb8c-6bcb33894264" containerName="nova-cell0-conductor-conductor" Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.956027 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:13 crc kubenswrapper[4740]: I0105 14:11:13.974776 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 05 14:11:14 crc kubenswrapper[4740]: I0105 14:11:14.010296 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 05 14:11:14 crc kubenswrapper[4740]: I0105 14:11:14.085730 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a12d4e-701a-4535-9755-79b170faffa6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"20a12d4e-701a-4535-9755-79b170faffa6\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:14 crc kubenswrapper[4740]: I0105 14:11:14.085801 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx54p\" (UniqueName: \"kubernetes.io/projected/20a12d4e-701a-4535-9755-79b170faffa6-kube-api-access-zx54p\") pod \"nova-cell0-conductor-0\" (UID: \"20a12d4e-701a-4535-9755-79b170faffa6\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:14 crc kubenswrapper[4740]: I0105 14:11:14.085904 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a12d4e-701a-4535-9755-79b170faffa6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"20a12d4e-701a-4535-9755-79b170faffa6\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:14 crc kubenswrapper[4740]: I0105 14:11:14.188254 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a12d4e-701a-4535-9755-79b170faffa6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"20a12d4e-701a-4535-9755-79b170faffa6\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:14 crc kubenswrapper[4740]: I0105 14:11:14.188312 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx54p\" (UniqueName: \"kubernetes.io/projected/20a12d4e-701a-4535-9755-79b170faffa6-kube-api-access-zx54p\") pod \"nova-cell0-conductor-0\" (UID: \"20a12d4e-701a-4535-9755-79b170faffa6\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:14 crc kubenswrapper[4740]: I0105 14:11:14.188407 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a12d4e-701a-4535-9755-79b170faffa6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"20a12d4e-701a-4535-9755-79b170faffa6\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:14 crc kubenswrapper[4740]: I0105 14:11:14.192094 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a12d4e-701a-4535-9755-79b170faffa6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"20a12d4e-701a-4535-9755-79b170faffa6\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:14 crc kubenswrapper[4740]: I0105 14:11:14.192561 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a12d4e-701a-4535-9755-79b170faffa6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"20a12d4e-701a-4535-9755-79b170faffa6\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:14 crc kubenswrapper[4740]: I0105 14:11:14.209555 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx54p\" (UniqueName: \"kubernetes.io/projected/20a12d4e-701a-4535-9755-79b170faffa6-kube-api-access-zx54p\") pod \"nova-cell0-conductor-0\" (UID: \"20a12d4e-701a-4535-9755-79b170faffa6\") " pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:14 crc kubenswrapper[4740]: I0105 14:11:14.305772 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:14 crc kubenswrapper[4740]: I0105 14:11:14.986379 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="602b77b2-3d49-4b22-bb8c-6bcb33894264" path="/var/lib/kubelet/pods/602b77b2-3d49-4b22-bb8c-6bcb33894264/volumes" Jan 05 14:11:14 crc kubenswrapper[4740]: I0105 14:11:14.994762 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 05 14:11:15 crc kubenswrapper[4740]: I0105 14:11:15.918585 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"20a12d4e-701a-4535-9755-79b170faffa6","Type":"ContainerStarted","Data":"0d1cfae51dcbe6d8f77145458cef7ed503a063ceed19c15ca42a6ffb035eb774"} Jan 05 14:11:15 crc kubenswrapper[4740]: I0105 14:11:15.918830 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"20a12d4e-701a-4535-9755-79b170faffa6","Type":"ContainerStarted","Data":"780ad57bdb0b9481b268c68c0a24acb84e24eb18edaa27d1917ee6c441f359ac"} Jan 05 14:11:15 crc kubenswrapper[4740]: I0105 14:11:15.918859 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:16 crc kubenswrapper[4740]: I0105 14:11:16.122098 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 05 14:11:16 crc kubenswrapper[4740]: I0105 14:11:16.122137 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 05 14:11:16 crc kubenswrapper[4740]: I0105 14:11:16.461304 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 05 14:11:16 crc kubenswrapper[4740]: I0105 14:11:16.501407 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:11:16 crc kubenswrapper[4740]: I0105 14:11:16.527051 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=3.527028226 podStartE2EDuration="3.527028226s" podCreationTimestamp="2026-01-05 14:11:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:11:15.94577125 +0000 UTC m=+1325.252679829" watchObservedRunningTime="2026-01-05 14:11:16.527028226 +0000 UTC m=+1325.833936805" Jan 05 14:11:16 crc kubenswrapper[4740]: I0105 14:11:16.571408 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-dfp7f"] Jan 05 14:11:16 crc kubenswrapper[4740]: I0105 14:11:16.571973 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" podUID="6a8de10c-cfde-436d-bdc9-2a42c89ac2c2" containerName="dnsmasq-dns" containerID="cri-o://3910165a921eccca67f947ed5072f932950b693b128335a004d40fe70025c29c" gracePeriod=10 Jan 05 14:11:16 crc kubenswrapper[4740]: I0105 14:11:16.582259 4740 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:16 crc kubenswrapper[4740]: I0105 14:11:16.930920 4740 generic.go:334] "Generic (PLEG): container finished" podID="6a8de10c-cfde-436d-bdc9-2a42c89ac2c2" containerID="3910165a921eccca67f947ed5072f932950b693b128335a004d40fe70025c29c" exitCode=0 Jan 05 14:11:16 crc kubenswrapper[4740]: I0105 14:11:16.931012 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" event={"ID":"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2","Type":"ContainerDied","Data":"3910165a921eccca67f947ed5072f932950b693b128335a004d40fe70025c29c"} Jan 05 14:11:16 crc kubenswrapper[4740]: I0105 14:11:16.932669 4740 generic.go:334] "Generic (PLEG): container finished" podID="2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb" containerID="a422b28e96fb5cde02201a4137a02f228c697a541da763888ce490fd33b1b8fb" exitCode=0 Jan 05 14:11:16 crc kubenswrapper[4740]: I0105 14:11:16.933833 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-g9xk2" event={"ID":"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb","Type":"ContainerDied","Data":"a422b28e96fb5cde02201a4137a02f228c697a541da763888ce490fd33b1b8fb"} Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.222140 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:11:17 crc kubenswrapper[4740]: E0105 14:11:17.373550 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dc1e0c7_eaaf_49c8_aa2c_88fe788bf6fa.slice/crio-b65f6486e8d2516657f1aaf405dacfeb8061d83b3f1d395912f11d009d94aa1a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e5db2b6_d22f_43eb_9a2a_7fa00d9b9cbb.slice/crio-a422b28e96fb5cde02201a4137a02f228c697a541da763888ce490fd33b1b8fb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod602b77b2_3d49_4b22_bb8c_6bcb33894264.slice/crio-f3cc2a285eee28588f055b48a364f42995d79a0d4cb8fa87beed1bf37c188320\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a8de10c_cfde_436d_bdc9_2a42c89ac2c2.slice/crio-conmon-3910165a921eccca67f947ed5072f932950b693b128335a004d40fe70025c29c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dc1e0c7_eaaf_49c8_aa2c_88fe788bf6fa.slice/crio-conmon-b65f6486e8d2516657f1aaf405dacfeb8061d83b3f1d395912f11d009d94aa1a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod602b77b2_3d49_4b22_bb8c_6bcb33894264.slice/crio-b89ac7c4f41907a5412d534b6d46dca3dc002bcf08d6c0cd81bc1d9a05b9ef76.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e5db2b6_d22f_43eb_9a2a_7fa00d9b9cbb.slice/crio-conmon-a422b28e96fb5cde02201a4137a02f228c697a541da763888ce490fd33b1b8fb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod602b77b2_3d49_4b22_bb8c_6bcb33894264.slice/crio-conmon-b89ac7c4f41907a5412d534b6d46dca3dc002bcf08d6c0cd81bc1d9a05b9ef76.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod602b77b2_3d49_4b22_bb8c_6bcb33894264.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a8de10c_cfde_436d_bdc9_2a42c89ac2c2.slice/crio-3910165a921eccca67f947ed5072f932950b693b128335a004d40fe70025c29c.scope\": RecentStats: unable to find data in memory cache]" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.374912 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-config\") pod \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.374970 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-dns-svc\") pod \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.375027 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-ovsdbserver-sb\") pod \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.375126 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-dns-swift-storage-0\") pod \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.375159 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-ovsdbserver-nb\") pod \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.376021 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhdnn\" (UniqueName: \"kubernetes.io/projected/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-kube-api-access-dhdnn\") pod \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\" (UID: \"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2\") " Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.384237 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-kube-api-access-dhdnn" (OuterVolumeSpecName: "kube-api-access-dhdnn") pod "6a8de10c-cfde-436d-bdc9-2a42c89ac2c2" (UID: "6a8de10c-cfde-436d-bdc9-2a42c89ac2c2"). InnerVolumeSpecName "kube-api-access-dhdnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.439714 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6a8de10c-cfde-436d-bdc9-2a42c89ac2c2" (UID: "6a8de10c-cfde-436d-bdc9-2a42c89ac2c2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.444798 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6a8de10c-cfde-436d-bdc9-2a42c89ac2c2" (UID: "6a8de10c-cfde-436d-bdc9-2a42c89ac2c2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.447949 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6a8de10c-cfde-436d-bdc9-2a42c89ac2c2" (UID: "6a8de10c-cfde-436d-bdc9-2a42c89ac2c2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.453663 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6a8de10c-cfde-436d-bdc9-2a42c89ac2c2" (UID: "6a8de10c-cfde-436d-bdc9-2a42c89ac2c2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.458259 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.459032 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-config" (OuterVolumeSpecName: "config") pod "6a8de10c-cfde-436d-bdc9-2a42c89ac2c2" (UID: "6a8de10c-cfde-436d-bdc9-2a42c89ac2c2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.479235 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.479262 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.479273 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.479284 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.479297 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.479305 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhdnn\" (UniqueName: \"kubernetes.io/projected/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2-kube-api-access-dhdnn\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.580807 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-combined-ca-bundle\") pod \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.580924 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-config-data\") pod \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.580956 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-sg-core-conf-yaml\") pod \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.581083 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-run-httpd\") pod \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.581102 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-log-httpd\") pod \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.581123 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-scripts\") pod \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.581186 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrdpk\" (UniqueName: \"kubernetes.io/projected/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-kube-api-access-qrdpk\") pod \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\" (UID: \"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa\") " Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.582409 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" (UID: "4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.582420 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" (UID: "4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.585387 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-scripts" (OuterVolumeSpecName: "scripts") pod "4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" (UID: "4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.585598 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-kube-api-access-qrdpk" (OuterVolumeSpecName: "kube-api-access-qrdpk") pod "4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" (UID: "4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa"). InnerVolumeSpecName "kube-api-access-qrdpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.618230 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" (UID: "4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.669767 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" (UID: "4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.685167 4740 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.685216 4740 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.685236 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.685254 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrdpk\" (UniqueName: \"kubernetes.io/projected/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-kube-api-access-qrdpk\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.685274 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.685293 4740 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.726271 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-config-data" (OuterVolumeSpecName: "config-data") pod "4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" (UID: "4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.787311 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.951692 4740 generic.go:334] "Generic (PLEG): container finished" podID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerID="b65f6486e8d2516657f1aaf405dacfeb8061d83b3f1d395912f11d009d94aa1a" exitCode=0 Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.951750 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa","Type":"ContainerDied","Data":"b65f6486e8d2516657f1aaf405dacfeb8061d83b3f1d395912f11d009d94aa1a"} Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.951805 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa","Type":"ContainerDied","Data":"532599b2f2d53993e7fdbcca984ae7c43873eda101fd8aedf39995c2ca4a492f"} Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.951808 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.951828 4740 scope.go:117] "RemoveContainer" containerID="f1b57a19da1eb0b94f75cdc9a17d877d25592b519b508c942988e8c9589f7705" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.955519 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.955517 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" event={"ID":"6a8de10c-cfde-436d-bdc9-2a42c89ac2c2","Type":"ContainerDied","Data":"75c0ac1698ad5b97084bae244602875970b924e847ca51719de832d00f4ed0b0"} Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.978631 4740 scope.go:117] "RemoveContainer" containerID="3932fe7a0a33999c636ebb58db8e52413e22e6a0c754b156c39d4dca6619b1f7" Jan 05 14:11:17 crc kubenswrapper[4740]: I0105 14:11:17.998886 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-dfp7f"] Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.019238 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-dfp7f"] Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.032388 4740 scope.go:117] "RemoveContainer" containerID="bd55d3fd52610b600f63b12b10d3999d91fd7d0f99f99b64d5b0a623f07cc1c0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.037041 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.054799 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.066209 4740 scope.go:117] "RemoveContainer" containerID="b65f6486e8d2516657f1aaf405dacfeb8061d83b3f1d395912f11d009d94aa1a" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.067260 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:11:18 crc kubenswrapper[4740]: E0105 14:11:18.067820 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerName="ceilometer-central-agent" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.067834 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerName="ceilometer-central-agent" Jan 05 14:11:18 crc kubenswrapper[4740]: E0105 14:11:18.067870 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerName="sg-core" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.067876 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerName="sg-core" Jan 05 14:11:18 crc kubenswrapper[4740]: E0105 14:11:18.067883 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a8de10c-cfde-436d-bdc9-2a42c89ac2c2" containerName="dnsmasq-dns" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.067890 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8de10c-cfde-436d-bdc9-2a42c89ac2c2" containerName="dnsmasq-dns" Jan 05 14:11:18 crc kubenswrapper[4740]: E0105 14:11:18.067900 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a8de10c-cfde-436d-bdc9-2a42c89ac2c2" containerName="init" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.067905 4740 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6a8de10c-cfde-436d-bdc9-2a42c89ac2c2" containerName="init" Jan 05 14:11:18 crc kubenswrapper[4740]: E0105 14:11:18.067921 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerName="ceilometer-notification-agent" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.067929 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerName="ceilometer-notification-agent" Jan 05 14:11:18 crc kubenswrapper[4740]: E0105 14:11:18.067953 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerName="proxy-httpd" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.067959 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerName="proxy-httpd" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.068232 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerName="ceilometer-central-agent" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.068260 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a8de10c-cfde-436d-bdc9-2a42c89ac2c2" containerName="dnsmasq-dns" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.068281 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerName="proxy-httpd" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.068300 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerName="sg-core" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.068312 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" containerName="ceilometer-notification-agent" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.075836 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.077688 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.077997 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.080234 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.103878 4740 scope.go:117] "RemoveContainer" containerID="f1b57a19da1eb0b94f75cdc9a17d877d25592b519b508c942988e8c9589f7705" Jan 05 14:11:18 crc kubenswrapper[4740]: E0105 14:11:18.104450 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1b57a19da1eb0b94f75cdc9a17d877d25592b519b508c942988e8c9589f7705\": container with ID starting with f1b57a19da1eb0b94f75cdc9a17d877d25592b519b508c942988e8c9589f7705 not found: ID does not exist" containerID="f1b57a19da1eb0b94f75cdc9a17d877d25592b519b508c942988e8c9589f7705" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.104475 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1b57a19da1eb0b94f75cdc9a17d877d25592b519b508c942988e8c9589f7705"} err="failed to get container status \"f1b57a19da1eb0b94f75cdc9a17d877d25592b519b508c942988e8c9589f7705\": rpc error: code = NotFound desc = could not find container \"f1b57a19da1eb0b94f75cdc9a17d877d25592b519b508c942988e8c9589f7705\": container with ID starting with f1b57a19da1eb0b94f75cdc9a17d877d25592b519b508c942988e8c9589f7705 not found: ID does not exist" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.104494 4740 scope.go:117] "RemoveContainer" containerID="3932fe7a0a33999c636ebb58db8e52413e22e6a0c754b156c39d4dca6619b1f7" Jan 05 14:11:18 crc kubenswrapper[4740]: E0105 14:11:18.104826 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3932fe7a0a33999c636ebb58db8e52413e22e6a0c754b156c39d4dca6619b1f7\": container with ID starting with 3932fe7a0a33999c636ebb58db8e52413e22e6a0c754b156c39d4dca6619b1f7 not found: ID does not exist" containerID="3932fe7a0a33999c636ebb58db8e52413e22e6a0c754b156c39d4dca6619b1f7" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.104840 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3932fe7a0a33999c636ebb58db8e52413e22e6a0c754b156c39d4dca6619b1f7"} err="failed to get container status \"3932fe7a0a33999c636ebb58db8e52413e22e6a0c754b156c39d4dca6619b1f7\": rpc error: code = NotFound desc = could not find container \"3932fe7a0a33999c636ebb58db8e52413e22e6a0c754b156c39d4dca6619b1f7\": container with ID starting with 3932fe7a0a33999c636ebb58db8e52413e22e6a0c754b156c39d4dca6619b1f7 not found: ID does not exist" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.104851 4740 scope.go:117] "RemoveContainer" containerID="bd55d3fd52610b600f63b12b10d3999d91fd7d0f99f99b64d5b0a623f07cc1c0" Jan 05 14:11:18 crc kubenswrapper[4740]: E0105 14:11:18.105806 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd55d3fd52610b600f63b12b10d3999d91fd7d0f99f99b64d5b0a623f07cc1c0\": container with ID starting with bd55d3fd52610b600f63b12b10d3999d91fd7d0f99f99b64d5b0a623f07cc1c0 not found: ID 
does not exist" containerID="bd55d3fd52610b600f63b12b10d3999d91fd7d0f99f99b64d5b0a623f07cc1c0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.105828 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd55d3fd52610b600f63b12b10d3999d91fd7d0f99f99b64d5b0a623f07cc1c0"} err="failed to get container status \"bd55d3fd52610b600f63b12b10d3999d91fd7d0f99f99b64d5b0a623f07cc1c0\": rpc error: code = NotFound desc = could not find container \"bd55d3fd52610b600f63b12b10d3999d91fd7d0f99f99b64d5b0a623f07cc1c0\": container with ID starting with bd55d3fd52610b600f63b12b10d3999d91fd7d0f99f99b64d5b0a623f07cc1c0 not found: ID does not exist" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.105841 4740 scope.go:117] "RemoveContainer" containerID="b65f6486e8d2516657f1aaf405dacfeb8061d83b3f1d395912f11d009d94aa1a" Jan 05 14:11:18 crc kubenswrapper[4740]: E0105 14:11:18.106705 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b65f6486e8d2516657f1aaf405dacfeb8061d83b3f1d395912f11d009d94aa1a\": container with ID starting with b65f6486e8d2516657f1aaf405dacfeb8061d83b3f1d395912f11d009d94aa1a not found: ID does not exist" containerID="b65f6486e8d2516657f1aaf405dacfeb8061d83b3f1d395912f11d009d94aa1a" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.106726 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b65f6486e8d2516657f1aaf405dacfeb8061d83b3f1d395912f11d009d94aa1a"} err="failed to get container status \"b65f6486e8d2516657f1aaf405dacfeb8061d83b3f1d395912f11d009d94aa1a\": rpc error: code = NotFound desc = could not find container \"b65f6486e8d2516657f1aaf405dacfeb8061d83b3f1d395912f11d009d94aa1a\": container with ID starting with b65f6486e8d2516657f1aaf405dacfeb8061d83b3f1d395912f11d009d94aa1a not found: ID does not exist" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.106740 4740 scope.go:117] "RemoveContainer" containerID="3910165a921eccca67f947ed5072f932950b693b128335a004d40fe70025c29c" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.133452 4740 scope.go:117] "RemoveContainer" containerID="f3ea946a9d1c5c6d0b54824bbec0b4034a55e57be80cde59ac7029bb830a11e2" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.196120 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-config-data\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.196181 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-scripts\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.196278 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.196757 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc1cd596-cff5-4f6f-9383-78a21bfd139f-log-httpd\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.196798 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc1cd596-cff5-4f6f-9383-78a21bfd139f-run-httpd\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.196946 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96sdd\" (UniqueName: \"kubernetes.io/projected/bc1cd596-cff5-4f6f-9383-78a21bfd139f-kube-api-access-96sdd\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.196980 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.299332 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-config-data\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.299388 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-scripts\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.299414 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.299610 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc1cd596-cff5-4f6f-9383-78a21bfd139f-log-httpd\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.299637 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc1cd596-cff5-4f6f-9383-78a21bfd139f-run-httpd\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.299748 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96sdd\" (UniqueName: \"kubernetes.io/projected/bc1cd596-cff5-4f6f-9383-78a21bfd139f-kube-api-access-96sdd\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.299778 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.300343 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc1cd596-cff5-4f6f-9383-78a21bfd139f-run-httpd\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.300391 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc1cd596-cff5-4f6f-9383-78a21bfd139f-log-httpd\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.305218 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.305235 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-scripts\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.306131 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-config-data\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.310121 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.319219 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96sdd\" (UniqueName: \"kubernetes.io/projected/bc1cd596-cff5-4f6f-9383-78a21bfd139f-kube-api-access-96sdd\") pod \"ceilometer-0\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.395648 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.449279 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-g9xk2" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.520872 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-combined-ca-bundle\") pod \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\" (UID: \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\") " Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.520962 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-config-data\") pod \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\" (UID: \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\") " Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.521101 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-scripts\") pod \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\" (UID: \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\") " Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.521275 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7qrl\" (UniqueName: \"kubernetes.io/projected/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-kube-api-access-d7qrl\") pod \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\" (UID: \"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb\") " Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.526307 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-scripts" (OuterVolumeSpecName: "scripts") pod "2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb" (UID: "2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.526942 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-kube-api-access-d7qrl" (OuterVolumeSpecName: "kube-api-access-d7qrl") pod "2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb" (UID: "2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb"). InnerVolumeSpecName "kube-api-access-d7qrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.564116 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-config-data" (OuterVolumeSpecName: "config-data") pod "2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb" (UID: "2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.573279 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb" (UID: "2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.624759 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.625094 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7qrl\" (UniqueName: \"kubernetes.io/projected/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-kube-api-access-d7qrl\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.625110 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.625121 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.908558 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:11:18 crc kubenswrapper[4740]: W0105 14:11:18.909606 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc1cd596_cff5_4f6f_9383_78a21bfd139f.slice/crio-94d364921ae859657f2214c898ec662c532a83578ee58f8ce9202889c24f7ffc WatchSource:0}: Error finding container 94d364921ae859657f2214c898ec662c532a83578ee58f8ce9202889c24f7ffc: Status 404 returned error can't find the container with id 94d364921ae859657f2214c898ec662c532a83578ee58f8ce9202889c24f7ffc Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.997194 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa" path="/var/lib/kubelet/pods/4dc1e0c7-eaaf-49c8-aa2c-88fe788bf6fa/volumes" Jan 05 14:11:18 crc kubenswrapper[4740]: I0105 14:11:18.999872 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-g9xk2" Jan 05 14:11:19 crc kubenswrapper[4740]: I0105 14:11:19.005411 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a8de10c-cfde-436d-bdc9-2a42c89ac2c2" path="/var/lib/kubelet/pods/6a8de10c-cfde-436d-bdc9-2a42c89ac2c2/volumes" Jan 05 14:11:19 crc kubenswrapper[4740]: I0105 14:11:19.007303 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-g9xk2" event={"ID":"2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb","Type":"ContainerDied","Data":"ffe2d450a85a81ea9c5cd3dc4b50eb372af920815808acaedbd629bcec750e21"} Jan 05 14:11:19 crc kubenswrapper[4740]: I0105 14:11:19.007348 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffe2d450a85a81ea9c5cd3dc4b50eb372af920815808acaedbd629bcec750e21" Jan 05 14:11:19 crc kubenswrapper[4740]: I0105 14:11:19.007368 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc1cd596-cff5-4f6f-9383-78a21bfd139f","Type":"ContainerStarted","Data":"94d364921ae859657f2214c898ec662c532a83578ee58f8ce9202889c24f7ffc"} Jan 05 14:11:20 crc kubenswrapper[4740]: I0105 14:11:20.031325 4740 generic.go:334] "Generic (PLEG): container finished" podID="6eb2618b-5bb1-4caf-a6f9-708cddcbf390" containerID="83a35e5626bd0af0822fc7d563c2a07eae40ef5243d0cecbe9a448d003703d5a" exitCode=0 Jan 05 14:11:20 crc kubenswrapper[4740]: I0105 14:11:20.031451 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-25blk" event={"ID":"6eb2618b-5bb1-4caf-a6f9-708cddcbf390","Type":"ContainerDied","Data":"83a35e5626bd0af0822fc7d563c2a07eae40ef5243d0cecbe9a448d003703d5a"} Jan 05 14:11:20 crc kubenswrapper[4740]: I0105 14:11:20.035240 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc1cd596-cff5-4f6f-9383-78a21bfd139f","Type":"ContainerStarted","Data":"365fea8618eecca354e5e4b6e4d273c3fbbc2eb8ab145c780954fcca286112d1"} Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.059184 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc1cd596-cff5-4f6f-9383-78a21bfd139f","Type":"ContainerStarted","Data":"9f173637b4d1f60b2d4059d9dbe6857940bf3cce7aae937a6e07e7a281039950"} Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.407307 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-c5pxn"] Jan 05 14:11:21 crc kubenswrapper[4740]: E0105 14:11:21.407825 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb" containerName="nova-manage" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.407838 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb" containerName="nova-manage" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.408118 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb" containerName="nova-manage" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.408894 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-c5pxn" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.458133 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-b00e-account-create-update-l22g4"] Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.460501 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-c5pxn"] Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.460621 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-b00e-account-create-update-l22g4" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.463026 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.492651 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9srs\" (UniqueName: \"kubernetes.io/projected/d54ed0de-5ddc-460f-a94e-f8223e8d36e9-kube-api-access-m9srs\") pod \"aodh-b00e-account-create-update-l22g4\" (UID: \"d54ed0de-5ddc-460f-a94e-f8223e8d36e9\") " pod="openstack/aodh-b00e-account-create-update-l22g4" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.492740 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac66a050-f6b1-49d7-8043-6bcd01940889-operator-scripts\") pod \"aodh-db-create-c5pxn\" (UID: \"ac66a050-f6b1-49d7-8043-6bcd01940889\") " pod="openstack/aodh-db-create-c5pxn" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.492794 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d54ed0de-5ddc-460f-a94e-f8223e8d36e9-operator-scripts\") pod \"aodh-b00e-account-create-update-l22g4\" (UID: \"d54ed0de-5ddc-460f-a94e-f8223e8d36e9\") " pod="openstack/aodh-b00e-account-create-update-l22g4" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.492975 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxg8n\" (UniqueName: \"kubernetes.io/projected/ac66a050-f6b1-49d7-8043-6bcd01940889-kube-api-access-cxg8n\") pod \"aodh-db-create-c5pxn\" (UID: \"ac66a050-f6b1-49d7-8043-6bcd01940889\") " pod="openstack/aodh-db-create-c5pxn" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.502380 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-b00e-account-create-update-l22g4"] Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.594147 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9srs\" (UniqueName: \"kubernetes.io/projected/d54ed0de-5ddc-460f-a94e-f8223e8d36e9-kube-api-access-m9srs\") pod \"aodh-b00e-account-create-update-l22g4\" (UID: \"d54ed0de-5ddc-460f-a94e-f8223e8d36e9\") " pod="openstack/aodh-b00e-account-create-update-l22g4" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.594201 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac66a050-f6b1-49d7-8043-6bcd01940889-operator-scripts\") pod \"aodh-db-create-c5pxn\" (UID: \"ac66a050-f6b1-49d7-8043-6bcd01940889\") " pod="openstack/aodh-db-create-c5pxn" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.594232 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/d54ed0de-5ddc-460f-a94e-f8223e8d36e9-operator-scripts\") pod \"aodh-b00e-account-create-update-l22g4\" (UID: \"d54ed0de-5ddc-460f-a94e-f8223e8d36e9\") " pod="openstack/aodh-b00e-account-create-update-l22g4" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.594313 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxg8n\" (UniqueName: \"kubernetes.io/projected/ac66a050-f6b1-49d7-8043-6bcd01940889-kube-api-access-cxg8n\") pod \"aodh-db-create-c5pxn\" (UID: \"ac66a050-f6b1-49d7-8043-6bcd01940889\") " pod="openstack/aodh-db-create-c5pxn" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.594952 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac66a050-f6b1-49d7-8043-6bcd01940889-operator-scripts\") pod \"aodh-db-create-c5pxn\" (UID: \"ac66a050-f6b1-49d7-8043-6bcd01940889\") " pod="openstack/aodh-db-create-c5pxn" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.595629 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d54ed0de-5ddc-460f-a94e-f8223e8d36e9-operator-scripts\") pod \"aodh-b00e-account-create-update-l22g4\" (UID: \"d54ed0de-5ddc-460f-a94e-f8223e8d36e9\") " pod="openstack/aodh-b00e-account-create-update-l22g4" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.613004 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9srs\" (UniqueName: \"kubernetes.io/projected/d54ed0de-5ddc-460f-a94e-f8223e8d36e9-kube-api-access-m9srs\") pod \"aodh-b00e-account-create-update-l22g4\" (UID: \"d54ed0de-5ddc-460f-a94e-f8223e8d36e9\") " pod="openstack/aodh-b00e-account-create-update-l22g4" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.616538 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxg8n\" (UniqueName: \"kubernetes.io/projected/ac66a050-f6b1-49d7-8043-6bcd01940889-kube-api-access-cxg8n\") pod \"aodh-db-create-c5pxn\" (UID: \"ac66a050-f6b1-49d7-8043-6bcd01940889\") " pod="openstack/aodh-db-create-c5pxn" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.700631 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-25blk" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.741660 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-c5pxn" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.799499 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-b00e-account-create-update-l22g4" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.800296 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-combined-ca-bundle\") pod \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\" (UID: \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\") " Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.800432 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r78xf\" (UniqueName: \"kubernetes.io/projected/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-kube-api-access-r78xf\") pod \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\" (UID: \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\") " Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.800535 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-config-data\") pod \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\" (UID: \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\") " Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.800606 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-scripts\") pod \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\" (UID: \"6eb2618b-5bb1-4caf-a6f9-708cddcbf390\") " Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.804669 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-kube-api-access-r78xf" (OuterVolumeSpecName: "kube-api-access-r78xf") pod "6eb2618b-5bb1-4caf-a6f9-708cddcbf390" (UID: "6eb2618b-5bb1-4caf-a6f9-708cddcbf390"). InnerVolumeSpecName "kube-api-access-r78xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.840012 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-scripts" (OuterVolumeSpecName: "scripts") pod "6eb2618b-5bb1-4caf-a6f9-708cddcbf390" (UID: "6eb2618b-5bb1-4caf-a6f9-708cddcbf390"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.857778 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6eb2618b-5bb1-4caf-a6f9-708cddcbf390" (UID: "6eb2618b-5bb1-4caf-a6f9-708cddcbf390"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.870853 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-config-data" (OuterVolumeSpecName: "config-data") pod "6eb2618b-5bb1-4caf-a6f9-708cddcbf390" (UID: "6eb2618b-5bb1-4caf-a6f9-708cddcbf390"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.908678 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.908713 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r78xf\" (UniqueName: \"kubernetes.io/projected/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-kube-api-access-r78xf\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.908725 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.908732 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eb2618b-5bb1-4caf-a6f9-708cddcbf390-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:21 crc kubenswrapper[4740]: I0105 14:11:21.968867 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-688b9f5b49-dfp7f" podUID="6a8de10c-cfde-436d-bdc9-2a42c89ac2c2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.217:5353: i/o timeout" Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.101384 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-25blk" event={"ID":"6eb2618b-5bb1-4caf-a6f9-708cddcbf390","Type":"ContainerDied","Data":"0900e907a9e5df79c3b1e4ffdbd4352986ef809763a88ca1ee8fb5ffd6fe9d33"} Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.101632 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0900e907a9e5df79c3b1e4ffdbd4352986ef809763a88ca1ee8fb5ffd6fe9d33" Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.101724 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-25blk" Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.161152 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 05 14:11:22 crc kubenswrapper[4740]: E0105 14:11:22.161816 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb2618b-5bb1-4caf-a6f9-708cddcbf390" containerName="nova-cell1-conductor-db-sync" Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.161834 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb2618b-5bb1-4caf-a6f9-708cddcbf390" containerName="nova-cell1-conductor-db-sync" Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.162055 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eb2618b-5bb1-4caf-a6f9-708cddcbf390" containerName="nova-cell1-conductor-db-sync" Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.162886 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.169527 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.176409 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.215570 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a384d1a-1719-4e7c-b0fc-7a15242711f0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4a384d1a-1719-4e7c-b0fc-7a15242711f0\") " pod="openstack/nova-cell1-conductor-0" Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.215657 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a384d1a-1719-4e7c-b0fc-7a15242711f0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4a384d1a-1719-4e7c-b0fc-7a15242711f0\") " pod="openstack/nova-cell1-conductor-0" Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.215734 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pdql\" (UniqueName: \"kubernetes.io/projected/4a384d1a-1719-4e7c-b0fc-7a15242711f0-kube-api-access-4pdql\") pod \"nova-cell1-conductor-0\" (UID: \"4a384d1a-1719-4e7c-b0fc-7a15242711f0\") " pod="openstack/nova-cell1-conductor-0" Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.317857 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a384d1a-1719-4e7c-b0fc-7a15242711f0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4a384d1a-1719-4e7c-b0fc-7a15242711f0\") " pod="openstack/nova-cell1-conductor-0" Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.317966 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a384d1a-1719-4e7c-b0fc-7a15242711f0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4a384d1a-1719-4e7c-b0fc-7a15242711f0\") " pod="openstack/nova-cell1-conductor-0" Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.318055 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pdql\" (UniqueName: \"kubernetes.io/projected/4a384d1a-1719-4e7c-b0fc-7a15242711f0-kube-api-access-4pdql\") pod \"nova-cell1-conductor-0\" (UID: \"4a384d1a-1719-4e7c-b0fc-7a15242711f0\") " pod="openstack/nova-cell1-conductor-0" Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.326858 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a384d1a-1719-4e7c-b0fc-7a15242711f0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4a384d1a-1719-4e7c-b0fc-7a15242711f0\") " pod="openstack/nova-cell1-conductor-0" Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.326951 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a384d1a-1719-4e7c-b0fc-7a15242711f0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4a384d1a-1719-4e7c-b0fc-7a15242711f0\") " pod="openstack/nova-cell1-conductor-0" Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.342561 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pdql\" (UniqueName: \"kubernetes.io/projected/4a384d1a-1719-4e7c-b0fc-7a15242711f0-kube-api-access-4pdql\") pod \"nova-cell1-conductor-0\" (UID: \"4a384d1a-1719-4e7c-b0fc-7a15242711f0\") " pod="openstack/nova-cell1-conductor-0" Jan 05 14:11:22 crc kubenswrapper[4740]: W0105 14:11:22.466191 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac66a050_f6b1_49d7_8043_6bcd01940889.slice/crio-bcdc77c5facb8b527f0112d9a74d5cc43cd816183e9fe2242b2047a67ac25b9c WatchSource:0}: Error finding container bcdc77c5facb8b527f0112d9a74d5cc43cd816183e9fe2242b2047a67ac25b9c: Status 404 returned error can't find the container with id bcdc77c5facb8b527f0112d9a74d5cc43cd816183e9fe2242b2047a67ac25b9c Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.485700 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-c5pxn"] Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.509352 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 05 14:11:22 crc kubenswrapper[4740]: I0105 14:11:22.586579 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-b00e-account-create-update-l22g4"] Jan 05 14:11:23 crc kubenswrapper[4740]: I0105 14:11:23.112369 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-c5pxn" event={"ID":"ac66a050-f6b1-49d7-8043-6bcd01940889","Type":"ContainerStarted","Data":"bcdc77c5facb8b527f0112d9a74d5cc43cd816183e9fe2242b2047a67ac25b9c"} Jan 05 14:11:23 crc kubenswrapper[4740]: I0105 14:11:23.114054 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-b00e-account-create-update-l22g4" event={"ID":"d54ed0de-5ddc-460f-a94e-f8223e8d36e9","Type":"ContainerStarted","Data":"9259dac4b540c26e3777ee7b6766f6f6e66120b7397c49506435d2ac018bc235"} Jan 05 14:11:23 crc kubenswrapper[4740]: I0105 14:11:23.630353 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 05 14:11:24 crc kubenswrapper[4740]: I0105 14:11:24.129117 4740 generic.go:334] "Generic (PLEG): container finished" podID="ac66a050-f6b1-49d7-8043-6bcd01940889" containerID="5972087ebfa046e8b0d8dbf9630de93aebaadc4ede2f0ea6a5d4e8ee0499d9d7" exitCode=0 Jan 05 14:11:24 crc kubenswrapper[4740]: I0105 14:11:24.129215 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-c5pxn" event={"ID":"ac66a050-f6b1-49d7-8043-6bcd01940889","Type":"ContainerDied","Data":"5972087ebfa046e8b0d8dbf9630de93aebaadc4ede2f0ea6a5d4e8ee0499d9d7"} Jan 05 14:11:24 crc kubenswrapper[4740]: I0105 14:11:24.135357 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4a384d1a-1719-4e7c-b0fc-7a15242711f0","Type":"ContainerStarted","Data":"c6ed6f32354a018bf50b3a6abc5223bb3902f9af526d14f75d5eb2451eb9d4de"} Jan 05 14:11:24 crc kubenswrapper[4740]: I0105 14:11:24.135423 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4a384d1a-1719-4e7c-b0fc-7a15242711f0","Type":"ContainerStarted","Data":"a86f56cec4367be23781a824dc2b5f2e68fe25b41f4cf7c655cef9bbb7e057c8"} Jan 05 14:11:24 crc kubenswrapper[4740]: I0105 14:11:24.135905 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 05 14:11:24 crc kubenswrapper[4740]: I0105 
14:11:24.137103 4740 generic.go:334] "Generic (PLEG): container finished" podID="d54ed0de-5ddc-460f-a94e-f8223e8d36e9" containerID="23d7a406a4470fe12f4b1940ed029315efec014f896a337fbfbcd31ed88a4b2f" exitCode=0 Jan 05 14:11:24 crc kubenswrapper[4740]: I0105 14:11:24.137183 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-b00e-account-create-update-l22g4" event={"ID":"d54ed0de-5ddc-460f-a94e-f8223e8d36e9","Type":"ContainerDied","Data":"23d7a406a4470fe12f4b1940ed029315efec014f896a337fbfbcd31ed88a4b2f"} Jan 05 14:11:24 crc kubenswrapper[4740]: I0105 14:11:24.140111 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc1cd596-cff5-4f6f-9383-78a21bfd139f","Type":"ContainerStarted","Data":"6a3041246c8e5d512003df850eb5d2c71ec5529262db1a5e959844aab4c569bb"} Jan 05 14:11:24 crc kubenswrapper[4740]: I0105 14:11:24.173764 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.173736617 podStartE2EDuration="2.173736617s" podCreationTimestamp="2026-01-05 14:11:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:11:24.15972222 +0000 UTC m=+1333.466630829" watchObservedRunningTime="2026-01-05 14:11:24.173736617 +0000 UTC m=+1333.480645206" Jan 05 14:11:24 crc kubenswrapper[4740]: I0105 14:11:24.340738 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 05 14:11:25 crc kubenswrapper[4740]: I0105 14:11:25.856051 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-c5pxn" Jan 05 14:11:25 crc kubenswrapper[4740]: I0105 14:11:25.863015 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-b00e-account-create-update-l22g4" Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.020668 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9srs\" (UniqueName: \"kubernetes.io/projected/d54ed0de-5ddc-460f-a94e-f8223e8d36e9-kube-api-access-m9srs\") pod \"d54ed0de-5ddc-460f-a94e-f8223e8d36e9\" (UID: \"d54ed0de-5ddc-460f-a94e-f8223e8d36e9\") " Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.020881 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac66a050-f6b1-49d7-8043-6bcd01940889-operator-scripts\") pod \"ac66a050-f6b1-49d7-8043-6bcd01940889\" (UID: \"ac66a050-f6b1-49d7-8043-6bcd01940889\") " Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.020967 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d54ed0de-5ddc-460f-a94e-f8223e8d36e9-operator-scripts\") pod \"d54ed0de-5ddc-460f-a94e-f8223e8d36e9\" (UID: \"d54ed0de-5ddc-460f-a94e-f8223e8d36e9\") " Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.021049 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxg8n\" (UniqueName: \"kubernetes.io/projected/ac66a050-f6b1-49d7-8043-6bcd01940889-kube-api-access-cxg8n\") pod \"ac66a050-f6b1-49d7-8043-6bcd01940889\" (UID: \"ac66a050-f6b1-49d7-8043-6bcd01940889\") " Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.022271 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac66a050-f6b1-49d7-8043-6bcd01940889-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac66a050-f6b1-49d7-8043-6bcd01940889" (UID: "ac66a050-f6b1-49d7-8043-6bcd01940889"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.028774 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d54ed0de-5ddc-460f-a94e-f8223e8d36e9-kube-api-access-m9srs" (OuterVolumeSpecName: "kube-api-access-m9srs") pod "d54ed0de-5ddc-460f-a94e-f8223e8d36e9" (UID: "d54ed0de-5ddc-460f-a94e-f8223e8d36e9"). InnerVolumeSpecName "kube-api-access-m9srs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.029138 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac66a050-f6b1-49d7-8043-6bcd01940889-kube-api-access-cxg8n" (OuterVolumeSpecName: "kube-api-access-cxg8n") pod "ac66a050-f6b1-49d7-8043-6bcd01940889" (UID: "ac66a050-f6b1-49d7-8043-6bcd01940889"). InnerVolumeSpecName "kube-api-access-cxg8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.030575 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d54ed0de-5ddc-460f-a94e-f8223e8d36e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d54ed0de-5ddc-460f-a94e-f8223e8d36e9" (UID: "d54ed0de-5ddc-460f-a94e-f8223e8d36e9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.123827 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac66a050-f6b1-49d7-8043-6bcd01940889-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.123872 4740 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d54ed0de-5ddc-460f-a94e-f8223e8d36e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.123889 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxg8n\" (UniqueName: \"kubernetes.io/projected/ac66a050-f6b1-49d7-8043-6bcd01940889-kube-api-access-cxg8n\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.123903 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9srs\" (UniqueName: \"kubernetes.io/projected/d54ed0de-5ddc-460f-a94e-f8223e8d36e9-kube-api-access-m9srs\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.163049 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-c5pxn" event={"ID":"ac66a050-f6b1-49d7-8043-6bcd01940889","Type":"ContainerDied","Data":"bcdc77c5facb8b527f0112d9a74d5cc43cd816183e9fe2242b2047a67ac25b9c"} Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.163411 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcdc77c5facb8b527f0112d9a74d5cc43cd816183e9fe2242b2047a67ac25b9c" Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.163113 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-c5pxn" Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.164937 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-b00e-account-create-update-l22g4" event={"ID":"d54ed0de-5ddc-460f-a94e-f8223e8d36e9","Type":"ContainerDied","Data":"9259dac4b540c26e3777ee7b6766f6f6e66120b7397c49506435d2ac018bc235"} Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.165000 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9259dac4b540c26e3777ee7b6766f6f6e66120b7397c49506435d2ac018bc235" Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.165092 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-b00e-account-create-update-l22g4" Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.176203 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc1cd596-cff5-4f6f-9383-78a21bfd139f","Type":"ContainerStarted","Data":"fc80cc70cbaabd6ef7a790b47123635889329a7e49ec24ad3b0c81473aef210f"} Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.176408 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 05 14:11:26 crc kubenswrapper[4740]: I0105 14:11:26.203915 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.036709765 podStartE2EDuration="8.203897054s" podCreationTimestamp="2026-01-05 14:11:18 +0000 UTC" firstStartedPulling="2026-01-05 14:11:18.911373921 +0000 UTC m=+1328.218282510" lastFinishedPulling="2026-01-05 14:11:25.07856122 +0000 UTC m=+1334.385469799" observedRunningTime="2026-01-05 14:11:26.198427677 +0000 UTC m=+1335.505336266" watchObservedRunningTime="2026-01-05 14:11:26.203897054 +0000 UTC m=+1335.510805633" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.743702 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-lwqt9"] Jan 05 14:11:31 crc kubenswrapper[4740]: E0105 14:11:31.744875 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac66a050-f6b1-49d7-8043-6bcd01940889" containerName="mariadb-database-create" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.744891 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac66a050-f6b1-49d7-8043-6bcd01940889" containerName="mariadb-database-create" Jan 05 14:11:31 crc kubenswrapper[4740]: E0105 14:11:31.744906 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54ed0de-5ddc-460f-a94e-f8223e8d36e9" containerName="mariadb-account-create-update" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.744915 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54ed0de-5ddc-460f-a94e-f8223e8d36e9" containerName="mariadb-account-create-update" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.745236 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="d54ed0de-5ddc-460f-a94e-f8223e8d36e9" containerName="mariadb-account-create-update" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.745264 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac66a050-f6b1-49d7-8043-6bcd01940889" containerName="mariadb-database-create" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.746178 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-lwqt9" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.751271 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.751404 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.751727 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.751962 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-f8s9l" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.756853 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-lwqt9"] Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.877633 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrbpt\" (UniqueName: \"kubernetes.io/projected/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-kube-api-access-nrbpt\") pod \"aodh-db-sync-lwqt9\" (UID: \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\") " pod="openstack/aodh-db-sync-lwqt9" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.877699 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-scripts\") pod \"aodh-db-sync-lwqt9\" (UID: \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\") " pod="openstack/aodh-db-sync-lwqt9" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.877904 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-config-data\") pod \"aodh-db-sync-lwqt9\" (UID: \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\") " pod="openstack/aodh-db-sync-lwqt9" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.878223 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-combined-ca-bundle\") pod \"aodh-db-sync-lwqt9\" (UID: \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\") " pod="openstack/aodh-db-sync-lwqt9" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.980781 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-combined-ca-bundle\") pod \"aodh-db-sync-lwqt9\" (UID: \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\") " pod="openstack/aodh-db-sync-lwqt9" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.981025 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrbpt\" (UniqueName: \"kubernetes.io/projected/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-kube-api-access-nrbpt\") pod \"aodh-db-sync-lwqt9\" (UID: \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\") " pod="openstack/aodh-db-sync-lwqt9" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.981108 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-scripts\") pod \"aodh-db-sync-lwqt9\" (UID: \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\") " pod="openstack/aodh-db-sync-lwqt9" Jan 05 14:11:31 crc 
kubenswrapper[4740]: I0105 14:11:31.981236 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-config-data\") pod \"aodh-db-sync-lwqt9\" (UID: \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\") " pod="openstack/aodh-db-sync-lwqt9" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.986708 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-scripts\") pod \"aodh-db-sync-lwqt9\" (UID: \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\") " pod="openstack/aodh-db-sync-lwqt9" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.988511 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-config-data\") pod \"aodh-db-sync-lwqt9\" (UID: \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\") " pod="openstack/aodh-db-sync-lwqt9" Jan 05 14:11:31 crc kubenswrapper[4740]: I0105 14:11:31.994516 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-combined-ca-bundle\") pod \"aodh-db-sync-lwqt9\" (UID: \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\") " pod="openstack/aodh-db-sync-lwqt9" Jan 05 14:11:32 crc kubenswrapper[4740]: I0105 14:11:32.013840 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrbpt\" (UniqueName: \"kubernetes.io/projected/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-kube-api-access-nrbpt\") pod \"aodh-db-sync-lwqt9\" (UID: \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\") " pod="openstack/aodh-db-sync-lwqt9" Jan 05 14:11:32 crc kubenswrapper[4740]: I0105 14:11:32.087655 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-lwqt9" Jan 05 14:11:32 crc kubenswrapper[4740]: I0105 14:11:32.565529 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 05 14:11:32 crc kubenswrapper[4740]: I0105 14:11:32.650905 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-lwqt9"] Jan 05 14:11:33 crc kubenswrapper[4740]: I0105 14:11:33.280670 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-lwqt9" event={"ID":"9724d8fc-2e73-46f4-b3ae-357a4c3e8313","Type":"ContainerStarted","Data":"980a9b759755b7feea0269392ace93abaa2583defb8eaf8806b3a41b9462ba17"} Jan 05 14:11:36 crc kubenswrapper[4740]: I0105 14:11:36.134809 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 05 14:11:36 crc kubenswrapper[4740]: I0105 14:11:36.135565 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 05 14:11:38 crc kubenswrapper[4740]: I0105 14:11:38.347021 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-lwqt9" event={"ID":"9724d8fc-2e73-46f4-b3ae-357a4c3e8313","Type":"ContainerStarted","Data":"5543ab60f4ffc93cdbbd1acf5198b760b3d4c50c32afb6e037d5961782aac2d2"} Jan 05 14:11:38 crc kubenswrapper[4740]: I0105 14:11:38.387918 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-lwqt9" podStartSLOduration=2.841655274 podStartE2EDuration="7.387896142s" podCreationTimestamp="2026-01-05 14:11:31 +0000 UTC" firstStartedPulling="2026-01-05 14:11:32.646762854 +0000 UTC m=+1341.953671433" lastFinishedPulling="2026-01-05 14:11:37.193003712 +0000 UTC m=+1346.499912301" observedRunningTime="2026-01-05 14:11:38.365234914 +0000 UTC m=+1347.672143573" watchObservedRunningTime="2026-01-05 14:11:38.387896142 +0000 UTC m=+1347.694804721" Jan 05 14:11:40 crc kubenswrapper[4740]: I0105 14:11:40.376882 4740 generic.go:334] "Generic (PLEG): container finished" podID="9724d8fc-2e73-46f4-b3ae-357a4c3e8313" containerID="5543ab60f4ffc93cdbbd1acf5198b760b3d4c50c32afb6e037d5961782aac2d2" exitCode=0 Jan 05 14:11:40 crc kubenswrapper[4740]: I0105 14:11:40.376991 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-lwqt9" event={"ID":"9724d8fc-2e73-46f4-b3ae-357a4c3e8313","Type":"ContainerDied","Data":"5543ab60f4ffc93cdbbd1acf5198b760b3d4c50c32afb6e037d5961782aac2d2"} Jan 05 14:11:41 crc kubenswrapper[4740]: I0105 14:11:41.944959 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-lwqt9" Jan 05 14:11:42 crc kubenswrapper[4740]: I0105 14:11:42.110598 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-scripts\") pod \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\" (UID: \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\") " Jan 05 14:11:42 crc kubenswrapper[4740]: I0105 14:11:42.110637 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-combined-ca-bundle\") pod \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\" (UID: \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\") " Jan 05 14:11:42 crc kubenswrapper[4740]: I0105 14:11:42.110785 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-config-data\") pod \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\" (UID: \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\") " Jan 05 14:11:42 crc kubenswrapper[4740]: I0105 14:11:42.110948 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrbpt\" (UniqueName: \"kubernetes.io/projected/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-kube-api-access-nrbpt\") pod \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\" (UID: \"9724d8fc-2e73-46f4-b3ae-357a4c3e8313\") " Jan 05 14:11:42 crc kubenswrapper[4740]: I0105 14:11:42.116418 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-scripts" (OuterVolumeSpecName: "scripts") pod "9724d8fc-2e73-46f4-b3ae-357a4c3e8313" (UID: "9724d8fc-2e73-46f4-b3ae-357a4c3e8313"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:42 crc kubenswrapper[4740]: I0105 14:11:42.132736 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-kube-api-access-nrbpt" (OuterVolumeSpecName: "kube-api-access-nrbpt") pod "9724d8fc-2e73-46f4-b3ae-357a4c3e8313" (UID: "9724d8fc-2e73-46f4-b3ae-357a4c3e8313"). InnerVolumeSpecName "kube-api-access-nrbpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:11:42 crc kubenswrapper[4740]: I0105 14:11:42.145318 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9724d8fc-2e73-46f4-b3ae-357a4c3e8313" (UID: "9724d8fc-2e73-46f4-b3ae-357a4c3e8313"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:42 crc kubenswrapper[4740]: I0105 14:11:42.147260 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-config-data" (OuterVolumeSpecName: "config-data") pod "9724d8fc-2e73-46f4-b3ae-357a4c3e8313" (UID: "9724d8fc-2e73-46f4-b3ae-357a4c3e8313"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:42 crc kubenswrapper[4740]: I0105 14:11:42.216049 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:42 crc kubenswrapper[4740]: I0105 14:11:42.216096 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:42 crc kubenswrapper[4740]: I0105 14:11:42.216109 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:42 crc kubenswrapper[4740]: I0105 14:11:42.216119 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrbpt\" (UniqueName: \"kubernetes.io/projected/9724d8fc-2e73-46f4-b3ae-357a4c3e8313-kube-api-access-nrbpt\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:42 crc kubenswrapper[4740]: I0105 14:11:42.415807 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-lwqt9" event={"ID":"9724d8fc-2e73-46f4-b3ae-357a4c3e8313","Type":"ContainerDied","Data":"980a9b759755b7feea0269392ace93abaa2583defb8eaf8806b3a41b9462ba17"} Jan 05 14:11:42 crc kubenswrapper[4740]: I0105 14:11:42.415867 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="980a9b759755b7feea0269392ace93abaa2583defb8eaf8806b3a41b9462ba17" Jan 05 14:11:42 crc kubenswrapper[4740]: I0105 14:11:42.415979 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-lwqt9" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.419538 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.438731 4740 generic.go:334] "Generic (PLEG): container finished" podID="62896e4a-182c-414b-a31f-0890587c3498" containerID="d95b3e320a1f06d72f5929bd0ae0d85f655d519b1c91c50cd6160b05bced811f" exitCode=137 Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.438806 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"62896e4a-182c-414b-a31f-0890587c3498","Type":"ContainerDied","Data":"d95b3e320a1f06d72f5929bd0ae0d85f655d519b1c91c50cd6160b05bced811f"} Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.458447 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b79vk\" (UniqueName: \"kubernetes.io/projected/de290cbf-7ff1-423b-b2d2-b7232b2165e4-kube-api-access-b79vk\") pod \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\" (UID: \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\") " Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.458605 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de290cbf-7ff1-423b-b2d2-b7232b2165e4-config-data\") pod \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\" (UID: \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\") " Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.458772 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de290cbf-7ff1-423b-b2d2-b7232b2165e4-logs\") pod \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\" (UID: \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\") " Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.458813 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de290cbf-7ff1-423b-b2d2-b7232b2165e4-combined-ca-bundle\") pod \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\" (UID: \"de290cbf-7ff1-423b-b2d2-b7232b2165e4\") " Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.459869 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de290cbf-7ff1-423b-b2d2-b7232b2165e4-logs" (OuterVolumeSpecName: "logs") pod "de290cbf-7ff1-423b-b2d2-b7232b2165e4" (UID: "de290cbf-7ff1-423b-b2d2-b7232b2165e4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.464175 4740 generic.go:334] "Generic (PLEG): container finished" podID="86f37a8e-5506-4845-ae5b-a71374c00a5f" containerID="bc865f45ae6bf992b6a2809d4faed867c29ce884f712346dfc72c74f9a7c4334" exitCode=137 Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.464232 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"86f37a8e-5506-4845-ae5b-a71374c00a5f","Type":"ContainerDied","Data":"bc865f45ae6bf992b6a2809d4faed867c29ce884f712346dfc72c74f9a7c4334"} Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.465477 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de290cbf-7ff1-423b-b2d2-b7232b2165e4-kube-api-access-b79vk" (OuterVolumeSpecName: "kube-api-access-b79vk") pod "de290cbf-7ff1-423b-b2d2-b7232b2165e4" (UID: "de290cbf-7ff1-423b-b2d2-b7232b2165e4"). InnerVolumeSpecName "kube-api-access-b79vk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.469421 4740 generic.go:334] "Generic (PLEG): container finished" podID="34b94c66-6d17-44d2-acb7-c9033986fedb" containerID="da8722ae38c11d4d22d5248a4269653e2ca7438958f74516b2bdbc138e16d8fc" exitCode=137 Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.469511 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"34b94c66-6d17-44d2-acb7-c9033986fedb","Type":"ContainerDied","Data":"da8722ae38c11d4d22d5248a4269653e2ca7438958f74516b2bdbc138e16d8fc"} Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.472997 4740 generic.go:334] "Generic (PLEG): container finished" podID="de290cbf-7ff1-423b-b2d2-b7232b2165e4" containerID="38f7c50b383da5e18d7341c2f5251104f8f5651cc48471a168f96b2790d31c89" exitCode=137 Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.473032 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"de290cbf-7ff1-423b-b2d2-b7232b2165e4","Type":"ContainerDied","Data":"38f7c50b383da5e18d7341c2f5251104f8f5651cc48471a168f96b2790d31c89"} Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.473053 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"de290cbf-7ff1-423b-b2d2-b7232b2165e4","Type":"ContainerDied","Data":"b91d5055f69aa67f4b97509ebe2f77d45cea9a1f02f7584391fb713344b3fae8"} Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.473057 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.473084 4740 scope.go:117] "RemoveContainer" containerID="38f7c50b383da5e18d7341c2f5251104f8f5651cc48471a168f96b2790d31c89" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.511652 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de290cbf-7ff1-423b-b2d2-b7232b2165e4-config-data" (OuterVolumeSpecName: "config-data") pod "de290cbf-7ff1-423b-b2d2-b7232b2165e4" (UID: "de290cbf-7ff1-423b-b2d2-b7232b2165e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.542632 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de290cbf-7ff1-423b-b2d2-b7232b2165e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de290cbf-7ff1-423b-b2d2-b7232b2165e4" (UID: "de290cbf-7ff1-423b-b2d2-b7232b2165e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.552110 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.561543 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b79vk\" (UniqueName: \"kubernetes.io/projected/de290cbf-7ff1-423b-b2d2-b7232b2165e4-kube-api-access-b79vk\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.561821 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de290cbf-7ff1-423b-b2d2-b7232b2165e4-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.561882 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de290cbf-7ff1-423b-b2d2-b7232b2165e4-logs\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.561933 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de290cbf-7ff1-423b-b2d2-b7232b2165e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.570497 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.580519 4740 scope.go:117] "RemoveContainer" containerID="2e8dfccdd1db04086c1f5be47bb90626a39bab2ecdcadd79c97b60e17c7e9d0c" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.583728 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.620275 4740 scope.go:117] "RemoveContainer" containerID="38f7c50b383da5e18d7341c2f5251104f8f5651cc48471a168f96b2790d31c89" Jan 05 14:11:43 crc kubenswrapper[4740]: E0105 14:11:43.620753 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38f7c50b383da5e18d7341c2f5251104f8f5651cc48471a168f96b2790d31c89\": container with ID starting with 38f7c50b383da5e18d7341c2f5251104f8f5651cc48471a168f96b2790d31c89 not found: ID does not exist" containerID="38f7c50b383da5e18d7341c2f5251104f8f5651cc48471a168f96b2790d31c89" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.620938 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38f7c50b383da5e18d7341c2f5251104f8f5651cc48471a168f96b2790d31c89"} err="failed to get container status \"38f7c50b383da5e18d7341c2f5251104f8f5651cc48471a168f96b2790d31c89\": rpc error: code = NotFound desc = could not find container \"38f7c50b383da5e18d7341c2f5251104f8f5651cc48471a168f96b2790d31c89\": container with ID starting with 38f7c50b383da5e18d7341c2f5251104f8f5651cc48471a168f96b2790d31c89 not found: ID does not exist" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.621118 4740 scope.go:117] "RemoveContainer" containerID="2e8dfccdd1db04086c1f5be47bb90626a39bab2ecdcadd79c97b60e17c7e9d0c" Jan 05 14:11:43 crc kubenswrapper[4740]: E0105 14:11:43.621623 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e8dfccdd1db04086c1f5be47bb90626a39bab2ecdcadd79c97b60e17c7e9d0c\": container with ID starting with 2e8dfccdd1db04086c1f5be47bb90626a39bab2ecdcadd79c97b60e17c7e9d0c not found: ID does not exist" containerID="2e8dfccdd1db04086c1f5be47bb90626a39bab2ecdcadd79c97b60e17c7e9d0c" Jan 05 14:11:43 
crc kubenswrapper[4740]: I0105 14:11:43.621671 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8dfccdd1db04086c1f5be47bb90626a39bab2ecdcadd79c97b60e17c7e9d0c"} err="failed to get container status \"2e8dfccdd1db04086c1f5be47bb90626a39bab2ecdcadd79c97b60e17c7e9d0c\": rpc error: code = NotFound desc = could not find container \"2e8dfccdd1db04086c1f5be47bb90626a39bab2ecdcadd79c97b60e17c7e9d0c\": container with ID starting with 2e8dfccdd1db04086c1f5be47bb90626a39bab2ecdcadd79c97b60e17c7e9d0c not found: ID does not exist" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.663050 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62896e4a-182c-414b-a31f-0890587c3498-combined-ca-bundle\") pod \"62896e4a-182c-414b-a31f-0890587c3498\" (UID: \"62896e4a-182c-414b-a31f-0890587c3498\") " Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.663189 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmq5x\" (UniqueName: \"kubernetes.io/projected/34b94c66-6d17-44d2-acb7-c9033986fedb-kube-api-access-qmq5x\") pod \"34b94c66-6d17-44d2-acb7-c9033986fedb\" (UID: \"34b94c66-6d17-44d2-acb7-c9033986fedb\") " Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.663275 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b94c66-6d17-44d2-acb7-c9033986fedb-config-data\") pod \"34b94c66-6d17-44d2-acb7-c9033986fedb\" (UID: \"34b94c66-6d17-44d2-acb7-c9033986fedb\") " Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.663326 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86f37a8e-5506-4845-ae5b-a71374c00a5f-logs\") pod \"86f37a8e-5506-4845-ae5b-a71374c00a5f\" (UID: \"86f37a8e-5506-4845-ae5b-a71374c00a5f\") " Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.663350 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpszf\" (UniqueName: \"kubernetes.io/projected/86f37a8e-5506-4845-ae5b-a71374c00a5f-kube-api-access-wpszf\") pod \"86f37a8e-5506-4845-ae5b-a71374c00a5f\" (UID: \"86f37a8e-5506-4845-ae5b-a71374c00a5f\") " Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.663446 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kww67\" (UniqueName: \"kubernetes.io/projected/62896e4a-182c-414b-a31f-0890587c3498-kube-api-access-kww67\") pod \"62896e4a-182c-414b-a31f-0890587c3498\" (UID: \"62896e4a-182c-414b-a31f-0890587c3498\") " Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.663507 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62896e4a-182c-414b-a31f-0890587c3498-config-data\") pod \"62896e4a-182c-414b-a31f-0890587c3498\" (UID: \"62896e4a-182c-414b-a31f-0890587c3498\") " Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.663549 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f37a8e-5506-4845-ae5b-a71374c00a5f-config-data\") pod \"86f37a8e-5506-4845-ae5b-a71374c00a5f\" (UID: \"86f37a8e-5506-4845-ae5b-a71374c00a5f\") " Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.663581 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f37a8e-5506-4845-ae5b-a71374c00a5f-combined-ca-bundle\") pod \"86f37a8e-5506-4845-ae5b-a71374c00a5f\" (UID: \"86f37a8e-5506-4845-ae5b-a71374c00a5f\") " Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.663607 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b94c66-6d17-44d2-acb7-c9033986fedb-combined-ca-bundle\") pod \"34b94c66-6d17-44d2-acb7-c9033986fedb\" (UID: \"34b94c66-6d17-44d2-acb7-c9033986fedb\") " Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.665235 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86f37a8e-5506-4845-ae5b-a71374c00a5f-logs" (OuterVolumeSpecName: "logs") pod "86f37a8e-5506-4845-ae5b-a71374c00a5f" (UID: "86f37a8e-5506-4845-ae5b-a71374c00a5f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.667901 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34b94c66-6d17-44d2-acb7-c9033986fedb-kube-api-access-qmq5x" (OuterVolumeSpecName: "kube-api-access-qmq5x") pod "34b94c66-6d17-44d2-acb7-c9033986fedb" (UID: "34b94c66-6d17-44d2-acb7-c9033986fedb"). InnerVolumeSpecName "kube-api-access-qmq5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.668049 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86f37a8e-5506-4845-ae5b-a71374c00a5f-kube-api-access-wpszf" (OuterVolumeSpecName: "kube-api-access-wpszf") pod "86f37a8e-5506-4845-ae5b-a71374c00a5f" (UID: "86f37a8e-5506-4845-ae5b-a71374c00a5f"). InnerVolumeSpecName "kube-api-access-wpszf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.669456 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62896e4a-182c-414b-a31f-0890587c3498-kube-api-access-kww67" (OuterVolumeSpecName: "kube-api-access-kww67") pod "62896e4a-182c-414b-a31f-0890587c3498" (UID: "62896e4a-182c-414b-a31f-0890587c3498"). InnerVolumeSpecName "kube-api-access-kww67". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.694886 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62896e4a-182c-414b-a31f-0890587c3498-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62896e4a-182c-414b-a31f-0890587c3498" (UID: "62896e4a-182c-414b-a31f-0890587c3498"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.695435 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f37a8e-5506-4845-ae5b-a71374c00a5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86f37a8e-5506-4845-ae5b-a71374c00a5f" (UID: "86f37a8e-5506-4845-ae5b-a71374c00a5f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.695900 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f37a8e-5506-4845-ae5b-a71374c00a5f-config-data" (OuterVolumeSpecName: "config-data") pod "86f37a8e-5506-4845-ae5b-a71374c00a5f" (UID: "86f37a8e-5506-4845-ae5b-a71374c00a5f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.696836 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34b94c66-6d17-44d2-acb7-c9033986fedb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34b94c66-6d17-44d2-acb7-c9033986fedb" (UID: "34b94c66-6d17-44d2-acb7-c9033986fedb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.707227 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34b94c66-6d17-44d2-acb7-c9033986fedb-config-data" (OuterVolumeSpecName: "config-data") pod "34b94c66-6d17-44d2-acb7-c9033986fedb" (UID: "34b94c66-6d17-44d2-acb7-c9033986fedb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.709206 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62896e4a-182c-414b-a31f-0890587c3498-config-data" (OuterVolumeSpecName: "config-data") pod "62896e4a-182c-414b-a31f-0890587c3498" (UID: "62896e4a-182c-414b-a31f-0890587c3498"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.765999 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmq5x\" (UniqueName: \"kubernetes.io/projected/34b94c66-6d17-44d2-acb7-c9033986fedb-kube-api-access-qmq5x\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.766034 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b94c66-6d17-44d2-acb7-c9033986fedb-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.766049 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86f37a8e-5506-4845-ae5b-a71374c00a5f-logs\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.766057 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpszf\" (UniqueName: \"kubernetes.io/projected/86f37a8e-5506-4845-ae5b-a71374c00a5f-kube-api-access-wpszf\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.766083 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kww67\" (UniqueName: \"kubernetes.io/projected/62896e4a-182c-414b-a31f-0890587c3498-kube-api-access-kww67\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.766093 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62896e4a-182c-414b-a31f-0890587c3498-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.766101 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/86f37a8e-5506-4845-ae5b-a71374c00a5f-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.766108 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f37a8e-5506-4845-ae5b-a71374c00a5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.766116 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b94c66-6d17-44d2-acb7-c9033986fedb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.766125 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62896e4a-182c-414b-a31f-0890587c3498-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.819121 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.845951 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.861464 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 05 14:11:43 crc kubenswrapper[4740]: E0105 14:11:43.862206 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9724d8fc-2e73-46f4-b3ae-357a4c3e8313" containerName="aodh-db-sync" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.862232 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="9724d8fc-2e73-46f4-b3ae-357a4c3e8313" containerName="aodh-db-sync" Jan 05 14:11:43 crc kubenswrapper[4740]: E0105 14:11:43.862258 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de290cbf-7ff1-423b-b2d2-b7232b2165e4" containerName="nova-metadata-metadata" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.862268 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="de290cbf-7ff1-423b-b2d2-b7232b2165e4" containerName="nova-metadata-metadata" Jan 05 14:11:43 crc kubenswrapper[4740]: E0105 14:11:43.862281 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f37a8e-5506-4845-ae5b-a71374c00a5f" containerName="nova-api-log" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.862290 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f37a8e-5506-4845-ae5b-a71374c00a5f" containerName="nova-api-log" Jan 05 14:11:43 crc kubenswrapper[4740]: E0105 14:11:43.862311 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f37a8e-5506-4845-ae5b-a71374c00a5f" containerName="nova-api-api" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.862316 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f37a8e-5506-4845-ae5b-a71374c00a5f" containerName="nova-api-api" Jan 05 14:11:43 crc kubenswrapper[4740]: E0105 14:11:43.862326 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b94c66-6d17-44d2-acb7-c9033986fedb" containerName="nova-cell1-novncproxy-novncproxy" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.862334 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b94c66-6d17-44d2-acb7-c9033986fedb" containerName="nova-cell1-novncproxy-novncproxy" Jan 05 14:11:43 crc kubenswrapper[4740]: E0105 14:11:43.862358 4740 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="de290cbf-7ff1-423b-b2d2-b7232b2165e4" containerName="nova-metadata-log" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.862365 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="de290cbf-7ff1-423b-b2d2-b7232b2165e4" containerName="nova-metadata-log" Jan 05 14:11:43 crc kubenswrapper[4740]: E0105 14:11:43.862377 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62896e4a-182c-414b-a31f-0890587c3498" containerName="nova-scheduler-scheduler" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.862383 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="62896e4a-182c-414b-a31f-0890587c3498" containerName="nova-scheduler-scheduler" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.862639 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="62896e4a-182c-414b-a31f-0890587c3498" containerName="nova-scheduler-scheduler" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.862651 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f37a8e-5506-4845-ae5b-a71374c00a5f" containerName="nova-api-api" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.862667 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="34b94c66-6d17-44d2-acb7-c9033986fedb" containerName="nova-cell1-novncproxy-novncproxy" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.862675 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="de290cbf-7ff1-423b-b2d2-b7232b2165e4" containerName="nova-metadata-log" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.862692 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="9724d8fc-2e73-46f4-b3ae-357a4c3e8313" containerName="aodh-db-sync" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.862704 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="de290cbf-7ff1-423b-b2d2-b7232b2165e4" containerName="nova-metadata-metadata" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.862718 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f37a8e-5506-4845-ae5b-a71374c00a5f" containerName="nova-api-log" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.863915 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.865627 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.868340 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.880045 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.970469 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " pod="openstack/nova-metadata-0" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.970824 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62f347cb-9580-4d54-adef-f3a06d466739-logs\") pod \"nova-metadata-0\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " pod="openstack/nova-metadata-0" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.970931 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9wpw\" (UniqueName: \"kubernetes.io/projected/62f347cb-9580-4d54-adef-f3a06d466739-kube-api-access-v9wpw\") pod \"nova-metadata-0\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " pod="openstack/nova-metadata-0" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.971029 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-config-data\") pod \"nova-metadata-0\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " pod="openstack/nova-metadata-0" Jan 05 14:11:43 crc kubenswrapper[4740]: I0105 14:11:43.971127 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " pod="openstack/nova-metadata-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.073025 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " pod="openstack/nova-metadata-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.073313 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " pod="openstack/nova-metadata-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.073345 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62f347cb-9580-4d54-adef-f3a06d466739-logs\") pod \"nova-metadata-0\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " 
pod="openstack/nova-metadata-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.073431 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9wpw\" (UniqueName: \"kubernetes.io/projected/62f347cb-9580-4d54-adef-f3a06d466739-kube-api-access-v9wpw\") pod \"nova-metadata-0\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " pod="openstack/nova-metadata-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.073498 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-config-data\") pod \"nova-metadata-0\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " pod="openstack/nova-metadata-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.075373 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62f347cb-9580-4d54-adef-f3a06d466739-logs\") pod \"nova-metadata-0\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " pod="openstack/nova-metadata-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.077988 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " pod="openstack/nova-metadata-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.078116 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " pod="openstack/nova-metadata-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.078769 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-config-data\") pod \"nova-metadata-0\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " pod="openstack/nova-metadata-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.090381 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9wpw\" (UniqueName: \"kubernetes.io/projected/62f347cb-9580-4d54-adef-f3a06d466739-kube-api-access-v9wpw\") pod \"nova-metadata-0\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " pod="openstack/nova-metadata-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.190532 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.491874 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"62896e4a-182c-414b-a31f-0890587c3498","Type":"ContainerDied","Data":"81f43bed880c54a2e3fa7d58850402e33c9299d88fac6403e7d1c6c3cbe64a53"} Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.492122 4740 scope.go:117] "RemoveContainer" containerID="d95b3e320a1f06d72f5929bd0ae0d85f655d519b1c91c50cd6160b05bced811f" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.491919 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.495770 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"86f37a8e-5506-4845-ae5b-a71374c00a5f","Type":"ContainerDied","Data":"d0e05826b308f443f6bff3fc2203123939b8bd1e697c4d781d697448b636b24e"} Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.495879 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.502633 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"34b94c66-6d17-44d2-acb7-c9033986fedb","Type":"ContainerDied","Data":"a70580802567064d454a9e78d85064c2e779396b34a0ab621b2c54e6cb9c6c90"} Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.502729 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.529561 4740 scope.go:117] "RemoveContainer" containerID="bc865f45ae6bf992b6a2809d4faed867c29ce884f712346dfc72c74f9a7c4334" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.558112 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.588468 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.603820 4740 scope.go:117] "RemoveContainer" containerID="2de6ea5db28ce1799e9e0aaa0cd943eae5c6a962aad50c42d163f9ccf6a23954" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.626122 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.628056 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.638691 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.638967 4740 scope.go:117] "RemoveContainer" containerID="da8722ae38c11d4d22d5248a4269653e2ca7438958f74516b2bdbc138e16d8fc" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.647028 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.661645 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.690790 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.691463 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.691583 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqd28\" (UniqueName: \"kubernetes.io/projected/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-kube-api-access-sqd28\") pod \"nova-scheduler-0\" (UID: \"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.691721 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-config-data\") pod \"nova-scheduler-0\" (UID: \"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.706208 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.719131 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.729880 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.732287 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.734393 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.743022 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.744633 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.747564 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.747769 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.747915 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.764241 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.775831 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.793214 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hddcm\" (UniqueName: \"kubernetes.io/projected/2ad0da6b-b746-4181-b51a-913721d967e3-kube-api-access-hddcm\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ad0da6b-b746-4181-b51a-913721d967e3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.793487 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbxmr\" (UniqueName: \"kubernetes.io/projected/8ae73796-c5c6-47f2-a40c-481842c4dbdd-kube-api-access-dbxmr\") pod \"nova-api-0\" (UID: \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\") " pod="openstack/nova-api-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.793557 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ad0da6b-b746-4181-b51a-913721d967e3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ad0da6b-b746-4181-b51a-913721d967e3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.793692 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ad0da6b-b746-4181-b51a-913721d967e3-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ad0da6b-b746-4181-b51a-913721d967e3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.793742 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.794120 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ad0da6b-b746-4181-b51a-913721d967e3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ad0da6b-b746-4181-b51a-913721d967e3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.794195 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqd28\" (UniqueName: \"kubernetes.io/projected/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-kube-api-access-sqd28\") pod 
\"nova-scheduler-0\" (UID: \"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.794254 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae73796-c5c6-47f2-a40c-481842c4dbdd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\") " pod="openstack/nova-api-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.794521 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ad0da6b-b746-4181-b51a-913721d967e3-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ad0da6b-b746-4181-b51a-913721d967e3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.794580 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ae73796-c5c6-47f2-a40c-481842c4dbdd-config-data\") pod \"nova-api-0\" (UID: \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\") " pod="openstack/nova-api-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.794603 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ae73796-c5c6-47f2-a40c-481842c4dbdd-logs\") pod \"nova-api-0\" (UID: \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\") " pod="openstack/nova-api-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.794758 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-config-data\") pod \"nova-scheduler-0\" (UID: \"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.798842 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.800665 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-config-data\") pod \"nova-scheduler-0\" (UID: \"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.800917 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.810719 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqd28\" (UniqueName: \"kubernetes.io/projected/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-kube-api-access-sqd28\") pod \"nova-scheduler-0\" (UID: \"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4\") " pod="openstack/nova-scheduler-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.896304 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ad0da6b-b746-4181-b51a-913721d967e3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ad0da6b-b746-4181-b51a-913721d967e3\") " 
pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.896341 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae73796-c5c6-47f2-a40c-481842c4dbdd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\") " pod="openstack/nova-api-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.896407 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ad0da6b-b746-4181-b51a-913721d967e3-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ad0da6b-b746-4181-b51a-913721d967e3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.896431 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ae73796-c5c6-47f2-a40c-481842c4dbdd-config-data\") pod \"nova-api-0\" (UID: \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\") " pod="openstack/nova-api-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.896448 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ae73796-c5c6-47f2-a40c-481842c4dbdd-logs\") pod \"nova-api-0\" (UID: \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\") " pod="openstack/nova-api-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.896522 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hddcm\" (UniqueName: \"kubernetes.io/projected/2ad0da6b-b746-4181-b51a-913721d967e3-kube-api-access-hddcm\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ad0da6b-b746-4181-b51a-913721d967e3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.896543 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbxmr\" (UniqueName: \"kubernetes.io/projected/8ae73796-c5c6-47f2-a40c-481842c4dbdd-kube-api-access-dbxmr\") pod \"nova-api-0\" (UID: \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\") " pod="openstack/nova-api-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.896581 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ad0da6b-b746-4181-b51a-913721d967e3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ad0da6b-b746-4181-b51a-913721d967e3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.896615 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ad0da6b-b746-4181-b51a-913721d967e3-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ad0da6b-b746-4181-b51a-913721d967e3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.897246 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ae73796-c5c6-47f2-a40c-481842c4dbdd-logs\") pod \"nova-api-0\" (UID: \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\") " pod="openstack/nova-api-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.899874 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2ad0da6b-b746-4181-b51a-913721d967e3-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ad0da6b-b746-4181-b51a-913721d967e3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.900366 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ad0da6b-b746-4181-b51a-913721d967e3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ad0da6b-b746-4181-b51a-913721d967e3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.902028 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ae73796-c5c6-47f2-a40c-481842c4dbdd-config-data\") pod \"nova-api-0\" (UID: \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\") " pod="openstack/nova-api-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.902036 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae73796-c5c6-47f2-a40c-481842c4dbdd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\") " pod="openstack/nova-api-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.902796 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ad0da6b-b746-4181-b51a-913721d967e3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ad0da6b-b746-4181-b51a-913721d967e3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.912516 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ad0da6b-b746-4181-b51a-913721d967e3-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ad0da6b-b746-4181-b51a-913721d967e3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.913736 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hddcm\" (UniqueName: \"kubernetes.io/projected/2ad0da6b-b746-4181-b51a-913721d967e3-kube-api-access-hddcm\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ad0da6b-b746-4181-b51a-913721d967e3\") " pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.922022 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbxmr\" (UniqueName: \"kubernetes.io/projected/8ae73796-c5c6-47f2-a40c-481842c4dbdd-kube-api-access-dbxmr\") pod \"nova-api-0\" (UID: \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\") " pod="openstack/nova-api-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.953106 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.985149 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34b94c66-6d17-44d2-acb7-c9033986fedb" path="/var/lib/kubelet/pods/34b94c66-6d17-44d2-acb7-c9033986fedb/volumes" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.986217 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62896e4a-182c-414b-a31f-0890587c3498" path="/var/lib/kubelet/pods/62896e4a-182c-414b-a31f-0890587c3498/volumes" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.986792 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86f37a8e-5506-4845-ae5b-a71374c00a5f" path="/var/lib/kubelet/pods/86f37a8e-5506-4845-ae5b-a71374c00a5f/volumes" Jan 05 14:11:44 crc kubenswrapper[4740]: I0105 14:11:44.987847 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de290cbf-7ff1-423b-b2d2-b7232b2165e4" path="/var/lib/kubelet/pods/de290cbf-7ff1-423b-b2d2-b7232b2165e4/volumes" Jan 05 14:11:45 crc kubenswrapper[4740]: I0105 14:11:45.053775 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 05 14:11:45 crc kubenswrapper[4740]: I0105 14:11:45.070039 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:45 crc kubenswrapper[4740]: W0105 14:11:45.449164 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e33bb1c_f72d_4c4b_a7c2_76889d62e9e4.slice/crio-27eb9bd3883d92842d75c123b8b8b3b8022278b00c3d0e65c898150b3f497654 WatchSource:0}: Error finding container 27eb9bd3883d92842d75c123b8b8b3b8022278b00c3d0e65c898150b3f497654: Status 404 returned error can't find the container with id 27eb9bd3883d92842d75c123b8b8b3b8022278b00c3d0e65c898150b3f497654 Jan 05 14:11:45 crc kubenswrapper[4740]: I0105 14:11:45.449499 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 05 14:11:45 crc kubenswrapper[4740]: I0105 14:11:45.532661 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4","Type":"ContainerStarted","Data":"27eb9bd3883d92842d75c123b8b8b3b8022278b00c3d0e65c898150b3f497654"} Jan 05 14:11:45 crc kubenswrapper[4740]: I0105 14:11:45.537278 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"62f347cb-9580-4d54-adef-f3a06d466739","Type":"ContainerStarted","Data":"6e808a165c4431fe28715b004fe16a60a04dba3239e57db49e807869fb6e6ecd"} Jan 05 14:11:45 crc kubenswrapper[4740]: I0105 14:11:45.537310 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"62f347cb-9580-4d54-adef-f3a06d466739","Type":"ContainerStarted","Data":"dc017291bd85a61bc42bdcdf132cfd29d1879b8864820f91a5eb516ddfaba35f"} Jan 05 14:11:45 crc kubenswrapper[4740]: I0105 14:11:45.537319 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"62f347cb-9580-4d54-adef-f3a06d466739","Type":"ContainerStarted","Data":"8c9e0ab460a502fb39663c4c8a9f0a3ea9e9456e74802c62c596c4a9aaccd270"} Jan 05 14:11:45 crc kubenswrapper[4740]: I0105 14:11:45.567545 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.5675243610000003 podStartE2EDuration="2.567524361s" 
podCreationTimestamp="2026-01-05 14:11:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:11:45.565694692 +0000 UTC m=+1354.872603291" watchObservedRunningTime="2026-01-05 14:11:45.567524361 +0000 UTC m=+1354.874432940" Jan 05 14:11:45 crc kubenswrapper[4740]: I0105 14:11:45.590676 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 05 14:11:45 crc kubenswrapper[4740]: W0105 14:11:45.591928 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ae73796_c5c6_47f2_a40c_481842c4dbdd.slice/crio-57ed3793a6de1ae51b78de2b64565c03b5d50608e7ec8a7d5c2c498137c2d52c WatchSource:0}: Error finding container 57ed3793a6de1ae51b78de2b64565c03b5d50608e7ec8a7d5c2c498137c2d52c: Status 404 returned error can't find the container with id 57ed3793a6de1ae51b78de2b64565c03b5d50608e7ec8a7d5c2c498137c2d52c Jan 05 14:11:45 crc kubenswrapper[4740]: W0105 14:11:45.667523 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ad0da6b_b746_4181_b51a_913721d967e3.slice/crio-5fe02438a2e99013c81ebdb19109966e8f6467028bf3e6a770f4b52d4fdd307f WatchSource:0}: Error finding container 5fe02438a2e99013c81ebdb19109966e8f6467028bf3e6a770f4b52d4fdd307f: Status 404 returned error can't find the container with id 5fe02438a2e99013c81ebdb19109966e8f6467028bf3e6a770f4b52d4fdd307f Jan 05 14:11:45 crc kubenswrapper[4740]: I0105 14:11:45.672545 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.337504 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.343478 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.349209 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.378771 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.379242 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-f8s9l" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.380230 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.430125 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b9ea403a-5716-4424-bfeb-61537e7c3969\") " pod="openstack/aodh-0" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.430496 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-scripts\") pod \"aodh-0\" (UID: \"b9ea403a-5716-4424-bfeb-61537e7c3969\") " pod="openstack/aodh-0" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.430675 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-config-data\") pod \"aodh-0\" (UID: \"b9ea403a-5716-4424-bfeb-61537e7c3969\") " pod="openstack/aodh-0" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.430894 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srrqv\" (UniqueName: \"kubernetes.io/projected/b9ea403a-5716-4424-bfeb-61537e7c3969-kube-api-access-srrqv\") pod \"aodh-0\" (UID: \"b9ea403a-5716-4424-bfeb-61537e7c3969\") " pod="openstack/aodh-0" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.533455 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b9ea403a-5716-4424-bfeb-61537e7c3969\") " pod="openstack/aodh-0" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.534482 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-scripts\") pod \"aodh-0\" (UID: \"b9ea403a-5716-4424-bfeb-61537e7c3969\") " pod="openstack/aodh-0" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.534608 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-config-data\") pod \"aodh-0\" (UID: \"b9ea403a-5716-4424-bfeb-61537e7c3969\") " pod="openstack/aodh-0" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.534877 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srrqv\" (UniqueName: \"kubernetes.io/projected/b9ea403a-5716-4424-bfeb-61537e7c3969-kube-api-access-srrqv\") pod \"aodh-0\" (UID: \"b9ea403a-5716-4424-bfeb-61537e7c3969\") " pod="openstack/aodh-0" Jan 05 14:11:46 crc kubenswrapper[4740]: 
I0105 14:11:46.554146 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b9ea403a-5716-4424-bfeb-61537e7c3969\") " pod="openstack/aodh-0" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.554670 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-scripts\") pod \"aodh-0\" (UID: \"b9ea403a-5716-4424-bfeb-61537e7c3969\") " pod="openstack/aodh-0" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.557834 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-config-data\") pod \"aodh-0\" (UID: \"b9ea403a-5716-4424-bfeb-61537e7c3969\") " pod="openstack/aodh-0" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.560775 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srrqv\" (UniqueName: \"kubernetes.io/projected/b9ea403a-5716-4424-bfeb-61537e7c3969-kube-api-access-srrqv\") pod \"aodh-0\" (UID: \"b9ea403a-5716-4424-bfeb-61537e7c3969\") " pod="openstack/aodh-0" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.573377 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ae73796-c5c6-47f2-a40c-481842c4dbdd","Type":"ContainerStarted","Data":"b55047bf2933e008f8f6d1470e6e75e2cb6626aaff17689b991270fa46a53823"} Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.573422 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ae73796-c5c6-47f2-a40c-481842c4dbdd","Type":"ContainerStarted","Data":"8ab689470a057eec541d8a06d8f58de9d2bc3a56d97455f86e54400484c75e43"} Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.573432 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ae73796-c5c6-47f2-a40c-481842c4dbdd","Type":"ContainerStarted","Data":"57ed3793a6de1ae51b78de2b64565c03b5d50608e7ec8a7d5c2c498137c2d52c"} Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.584606 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2ad0da6b-b746-4181-b51a-913721d967e3","Type":"ContainerStarted","Data":"44289bd531d793a18f7d5bf9542dbb55d717362a186029c0344ea9bb3569d313"} Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.584657 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2ad0da6b-b746-4181-b51a-913721d967e3","Type":"ContainerStarted","Data":"5fe02438a2e99013c81ebdb19109966e8f6467028bf3e6a770f4b52d4fdd307f"} Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.591336 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4","Type":"ContainerStarted","Data":"ef314be233895a93e2ef22c97fbccac10047a6437e5f0c89615cf5aabd1dbc24"} Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.612135 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.612115427 podStartE2EDuration="2.612115427s" podCreationTimestamp="2026-01-05 14:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:11:46.591817182 
+0000 UTC m=+1355.898725781" watchObservedRunningTime="2026-01-05 14:11:46.612115427 +0000 UTC m=+1355.919024006" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.632245 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.632225287 podStartE2EDuration="2.632225287s" podCreationTimestamp="2026-01-05 14:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:11:46.614772338 +0000 UTC m=+1355.921680917" watchObservedRunningTime="2026-01-05 14:11:46.632225287 +0000 UTC m=+1355.939133866" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.649443 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.649423318 podStartE2EDuration="2.649423318s" podCreationTimestamp="2026-01-05 14:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:11:46.63792767 +0000 UTC m=+1355.944836249" watchObservedRunningTime="2026-01-05 14:11:46.649423318 +0000 UTC m=+1355.956331907" Jan 05 14:11:46 crc kubenswrapper[4740]: I0105 14:11:46.700127 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 05 14:11:47 crc kubenswrapper[4740]: I0105 14:11:47.383057 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 05 14:11:47 crc kubenswrapper[4740]: I0105 14:11:47.601292 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b9ea403a-5716-4424-bfeb-61537e7c3969","Type":"ContainerStarted","Data":"70cb164cfca68fd94866ff3ebd2ff0aa8b5c92c56a44ab6eb1ac019dcd176a42"} Jan 05 14:11:48 crc kubenswrapper[4740]: I0105 14:11:48.403394 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 05 14:11:48 crc kubenswrapper[4740]: I0105 14:11:48.615172 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b9ea403a-5716-4424-bfeb-61537e7c3969","Type":"ContainerStarted","Data":"bb39eed9a7b3583dfe9836d671d14f898c313238abffbc81ada6878e3ed529df"} Jan 05 14:11:49 crc kubenswrapper[4740]: I0105 14:11:49.191696 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 05 14:11:49 crc kubenswrapper[4740]: I0105 14:11:49.191800 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 05 14:11:49 crc kubenswrapper[4740]: I0105 14:11:49.211281 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:11:49 crc kubenswrapper[4740]: I0105 14:11:49.211517 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerName="ceilometer-central-agent" containerID="cri-o://365fea8618eecca354e5e4b6e4d273c3fbbc2eb8ab145c780954fcca286112d1" gracePeriod=30 Jan 05 14:11:49 crc kubenswrapper[4740]: I0105 14:11:49.211615 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerName="proxy-httpd" containerID="cri-o://fc80cc70cbaabd6ef7a790b47123635889329a7e49ec24ad3b0c81473aef210f" gracePeriod=30 Jan 05 14:11:49 crc kubenswrapper[4740]: I0105 14:11:49.211638 4740 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerName="ceilometer-notification-agent" containerID="cri-o://9f173637b4d1f60b2d4059d9dbe6857940bf3cce7aae937a6e07e7a281039950" gracePeriod=30 Jan 05 14:11:49 crc kubenswrapper[4740]: I0105 14:11:49.211644 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerName="sg-core" containerID="cri-o://6a3041246c8e5d512003df850eb5d2c71ec5529262db1a5e959844aab4c569bb" gracePeriod=30 Jan 05 14:11:49 crc kubenswrapper[4740]: I0105 14:11:49.544913 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Jan 05 14:11:49 crc kubenswrapper[4740]: I0105 14:11:49.627683 4740 generic.go:334] "Generic (PLEG): container finished" podID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerID="6a3041246c8e5d512003df850eb5d2c71ec5529262db1a5e959844aab4c569bb" exitCode=2 Jan 05 14:11:49 crc kubenswrapper[4740]: I0105 14:11:49.627746 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc1cd596-cff5-4f6f-9383-78a21bfd139f","Type":"ContainerDied","Data":"6a3041246c8e5d512003df850eb5d2c71ec5529262db1a5e959844aab4c569bb"} Jan 05 14:11:49 crc kubenswrapper[4740]: I0105 14:11:49.953518 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 05 14:11:50 crc kubenswrapper[4740]: I0105 14:11:50.070458 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:50 crc kubenswrapper[4740]: I0105 14:11:50.642705 4740 generic.go:334] "Generic (PLEG): container finished" podID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerID="fc80cc70cbaabd6ef7a790b47123635889329a7e49ec24ad3b0c81473aef210f" exitCode=0 Jan 05 14:11:50 crc kubenswrapper[4740]: I0105 14:11:50.642923 4740 generic.go:334] "Generic (PLEG): container finished" podID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerID="365fea8618eecca354e5e4b6e4d273c3fbbc2eb8ab145c780954fcca286112d1" exitCode=0 Jan 05 14:11:50 crc kubenswrapper[4740]: I0105 14:11:50.642959 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc1cd596-cff5-4f6f-9383-78a21bfd139f","Type":"ContainerDied","Data":"fc80cc70cbaabd6ef7a790b47123635889329a7e49ec24ad3b0c81473aef210f"} Jan 05 14:11:50 crc kubenswrapper[4740]: I0105 14:11:50.642983 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc1cd596-cff5-4f6f-9383-78a21bfd139f","Type":"ContainerDied","Data":"365fea8618eecca354e5e4b6e4d273c3fbbc2eb8ab145c780954fcca286112d1"} Jan 05 14:11:50 crc kubenswrapper[4740]: I0105 14:11:50.644584 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b9ea403a-5716-4424-bfeb-61537e7c3969","Type":"ContainerStarted","Data":"9ae4cce6bba6e901e04c18948752dde5a7167997717b7292e1b23213a9580aae"} Jan 05 14:11:52 crc kubenswrapper[4740]: I0105 14:11:52.715819 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b9ea403a-5716-4424-bfeb-61537e7c3969","Type":"ContainerStarted","Data":"cce08d473f01d1930853d089c569f7b9039ff8e9f5e57543e44aa5ead335d46a"} Jan 05 14:11:52 crc kubenswrapper[4740]: I0105 14:11:52.722601 4740 generic.go:334] "Generic (PLEG): container finished" podID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" 
containerID="9f173637b4d1f60b2d4059d9dbe6857940bf3cce7aae937a6e07e7a281039950" exitCode=0 Jan 05 14:11:52 crc kubenswrapper[4740]: I0105 14:11:52.722656 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc1cd596-cff5-4f6f-9383-78a21bfd139f","Type":"ContainerDied","Data":"9f173637b4d1f60b2d4059d9dbe6857940bf3cce7aae937a6e07e7a281039950"} Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.055313 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.110001 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-combined-ca-bundle\") pod \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.110140 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-sg-core-conf-yaml\") pod \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.110225 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc1cd596-cff5-4f6f-9383-78a21bfd139f-log-httpd\") pod \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.110324 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-config-data\") pod \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.110404 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-scripts\") pod \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.110431 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc1cd596-cff5-4f6f-9383-78a21bfd139f-run-httpd\") pod \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.110567 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96sdd\" (UniqueName: \"kubernetes.io/projected/bc1cd596-cff5-4f6f-9383-78a21bfd139f-kube-api-access-96sdd\") pod \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\" (UID: \"bc1cd596-cff5-4f6f-9383-78a21bfd139f\") " Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.113426 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc1cd596-cff5-4f6f-9383-78a21bfd139f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bc1cd596-cff5-4f6f-9383-78a21bfd139f" (UID: "bc1cd596-cff5-4f6f-9383-78a21bfd139f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.115270 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc1cd596-cff5-4f6f-9383-78a21bfd139f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bc1cd596-cff5-4f6f-9383-78a21bfd139f" (UID: "bc1cd596-cff5-4f6f-9383-78a21bfd139f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.117273 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-scripts" (OuterVolumeSpecName: "scripts") pod "bc1cd596-cff5-4f6f-9383-78a21bfd139f" (UID: "bc1cd596-cff5-4f6f-9383-78a21bfd139f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.120200 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc1cd596-cff5-4f6f-9383-78a21bfd139f-kube-api-access-96sdd" (OuterVolumeSpecName: "kube-api-access-96sdd") pod "bc1cd596-cff5-4f6f-9383-78a21bfd139f" (UID: "bc1cd596-cff5-4f6f-9383-78a21bfd139f"). InnerVolumeSpecName "kube-api-access-96sdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.164393 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bc1cd596-cff5-4f6f-9383-78a21bfd139f" (UID: "bc1cd596-cff5-4f6f-9383-78a21bfd139f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.216842 4740 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.216882 4740 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc1cd596-cff5-4f6f-9383-78a21bfd139f-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.216891 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.216899 4740 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc1cd596-cff5-4f6f-9383-78a21bfd139f-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.216910 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96sdd\" (UniqueName: \"kubernetes.io/projected/bc1cd596-cff5-4f6f-9383-78a21bfd139f-kube-api-access-96sdd\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.233555 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc1cd596-cff5-4f6f-9383-78a21bfd139f" (UID: "bc1cd596-cff5-4f6f-9383-78a21bfd139f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.264772 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-config-data" (OuterVolumeSpecName: "config-data") pod "bc1cd596-cff5-4f6f-9383-78a21bfd139f" (UID: "bc1cd596-cff5-4f6f-9383-78a21bfd139f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.321446 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.321481 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc1cd596-cff5-4f6f-9383-78a21bfd139f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.752272 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc1cd596-cff5-4f6f-9383-78a21bfd139f","Type":"ContainerDied","Data":"94d364921ae859657f2214c898ec662c532a83578ee58f8ce9202889c24f7ffc"} Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.752634 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.752641 4740 scope.go:117] "RemoveContainer" containerID="fc80cc70cbaabd6ef7a790b47123635889329a7e49ec24ad3b0c81473aef210f" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.812234 4740 scope.go:117] "RemoveContainer" containerID="6a3041246c8e5d512003df850eb5d2c71ec5529262db1a5e959844aab4c569bb" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.815130 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.837009 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.852117 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:11:53 crc kubenswrapper[4740]: E0105 14:11:53.852643 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerName="ceilometer-central-agent" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.852660 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerName="ceilometer-central-agent" Jan 05 14:11:53 crc kubenswrapper[4740]: E0105 14:11:53.852692 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerName="ceilometer-notification-agent" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.852698 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerName="ceilometer-notification-agent" Jan 05 14:11:53 crc kubenswrapper[4740]: E0105 14:11:53.852706 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerName="sg-core" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.852712 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerName="sg-core" Jan 05 14:11:53 crc kubenswrapper[4740]: E0105 
14:11:53.852728 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerName="proxy-httpd" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.852734 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerName="proxy-httpd" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.852962 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerName="ceilometer-notification-agent" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.852986 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerName="sg-core" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.853000 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerName="ceilometer-central-agent" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.853020 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" containerName="proxy-httpd" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.856637 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.864458 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.865524 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.865620 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.884411 4740 scope.go:117] "RemoveContainer" containerID="9f173637b4d1f60b2d4059d9dbe6857940bf3cce7aae937a6e07e7a281039950" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.934458 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.934523 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krblv\" (UniqueName: \"kubernetes.io/projected/ed9a6303-21dd-4900-94c9-5849989cecc2-kube-api-access-krblv\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.934574 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-config-data\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.934625 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed9a6303-21dd-4900-94c9-5849989cecc2-log-httpd\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.934697 
4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.934723 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed9a6303-21dd-4900-94c9-5849989cecc2-run-httpd\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.934786 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-scripts\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:53 crc kubenswrapper[4740]: I0105 14:11:53.951403 4740 scope.go:117] "RemoveContainer" containerID="365fea8618eecca354e5e4b6e4d273c3fbbc2eb8ab145c780954fcca286112d1" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.037180 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.037502 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed9a6303-21dd-4900-94c9-5849989cecc2-run-httpd\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.037706 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-scripts\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.038019 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed9a6303-21dd-4900-94c9-5849989cecc2-run-httpd\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.038311 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.038529 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krblv\" (UniqueName: \"kubernetes.io/projected/ed9a6303-21dd-4900-94c9-5849989cecc2-kube-api-access-krblv\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.038766 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-config-data\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.038963 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed9a6303-21dd-4900-94c9-5849989cecc2-log-httpd\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.039830 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed9a6303-21dd-4900-94c9-5849989cecc2-log-httpd\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.043919 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.046183 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-scripts\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.046438 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-config-data\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.047188 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.060965 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krblv\" (UniqueName: \"kubernetes.io/projected/ed9a6303-21dd-4900-94c9-5849989cecc2-kube-api-access-krblv\") pod \"ceilometer-0\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " pod="openstack/ceilometer-0" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.191791 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.194508 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.212509 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.765222 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b9ea403a-5716-4424-bfeb-61537e7c3969","Type":"ContainerStarted","Data":"a7465aac3718aa934ac766c826ddef7fd8ef6f71f0022b16bf7cbd1d8a82ae1e"} Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.765502 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerName="aodh-api" containerID="cri-o://bb39eed9a7b3583dfe9836d671d14f898c313238abffbc81ada6878e3ed529df" gracePeriod=30 Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.765993 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerName="aodh-listener" containerID="cri-o://a7465aac3718aa934ac766c826ddef7fd8ef6f71f0022b16bf7cbd1d8a82ae1e" gracePeriod=30 Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.766041 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerName="aodh-notifier" containerID="cri-o://cce08d473f01d1930853d089c569f7b9039ff8e9f5e57543e44aa5ead335d46a" gracePeriod=30 Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.766090 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerName="aodh-evaluator" containerID="cri-o://9ae4cce6bba6e901e04c18948752dde5a7167997717b7292e1b23213a9580aae" gracePeriod=30 Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.800758 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.233303403 podStartE2EDuration="8.800734316s" podCreationTimestamp="2026-01-05 14:11:46 +0000 UTC" firstStartedPulling="2026-01-05 14:11:47.396402523 +0000 UTC m=+1356.703311112" lastFinishedPulling="2026-01-05 14:11:53.963833446 +0000 UTC m=+1363.270742025" observedRunningTime="2026-01-05 14:11:54.786887764 +0000 UTC m=+1364.093796343" watchObservedRunningTime="2026-01-05 14:11:54.800734316 +0000 UTC m=+1364.107642895" Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.847756 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:11:54 crc kubenswrapper[4740]: I0105 14:11:54.955520 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.008708 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc1cd596-cff5-4f6f-9383-78a21bfd139f" path="/var/lib/kubelet/pods/bc1cd596-cff5-4f6f-9383-78a21bfd139f/volumes" Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.010714 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.054154 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.055514 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.071814 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" 
Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.098493 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.206363 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="62f347cb-9580-4d54-adef-f3a06d466739" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.254:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.206383 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="62f347cb-9580-4d54-adef-f3a06d466739" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.254:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.792416 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed9a6303-21dd-4900-94c9-5849989cecc2","Type":"ContainerStarted","Data":"af64dd467bed4ea10abb54c3d3b2a7b142d3924d7af789586dbfbfd6bc8c302c"} Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.792729 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed9a6303-21dd-4900-94c9-5849989cecc2","Type":"ContainerStarted","Data":"a7a72b92be5e8736a7542c031be6ab58af6aef1d236d64ed64a3f80306fe070c"} Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.799057 4740 generic.go:334] "Generic (PLEG): container finished" podID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerID="cce08d473f01d1930853d089c569f7b9039ff8e9f5e57543e44aa5ead335d46a" exitCode=0 Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.799110 4740 generic.go:334] "Generic (PLEG): container finished" podID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerID="9ae4cce6bba6e901e04c18948752dde5a7167997717b7292e1b23213a9580aae" exitCode=0 Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.799122 4740 generic.go:334] "Generic (PLEG): container finished" podID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerID="bb39eed9a7b3583dfe9836d671d14f898c313238abffbc81ada6878e3ed529df" exitCode=0 Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.800690 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b9ea403a-5716-4424-bfeb-61537e7c3969","Type":"ContainerDied","Data":"cce08d473f01d1930853d089c569f7b9039ff8e9f5e57543e44aa5ead335d46a"} Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.800731 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b9ea403a-5716-4424-bfeb-61537e7c3969","Type":"ContainerDied","Data":"9ae4cce6bba6e901e04c18948752dde5a7167997717b7292e1b23213a9580aae"} Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.800744 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b9ea403a-5716-4424-bfeb-61537e7c3969","Type":"ContainerDied","Data":"bb39eed9a7b3583dfe9836d671d14f898c313238abffbc81ada6878e3ed529df"} Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.817102 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 05 14:11:55 crc kubenswrapper[4740]: I0105 14:11:55.853235 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 05 14:11:56 crc kubenswrapper[4740]: 
I0105 14:11:56.002677 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mwvss"] Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.004488 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mwvss" Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.015766 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.015830 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.017576 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mwvss"] Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.119796 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-config-data\") pod \"nova-cell1-cell-mapping-mwvss\" (UID: \"37ab0150-0aef-4e09-b54b-ff6201665b33\") " pod="openstack/nova-cell1-cell-mapping-mwvss" Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.119913 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfjm4\" (UniqueName: \"kubernetes.io/projected/37ab0150-0aef-4e09-b54b-ff6201665b33-kube-api-access-dfjm4\") pod \"nova-cell1-cell-mapping-mwvss\" (UID: \"37ab0150-0aef-4e09-b54b-ff6201665b33\") " pod="openstack/nova-cell1-cell-mapping-mwvss" Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.120041 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-scripts\") pod \"nova-cell1-cell-mapping-mwvss\" (UID: \"37ab0150-0aef-4e09-b54b-ff6201665b33\") " pod="openstack/nova-cell1-cell-mapping-mwvss" Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.120118 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mwvss\" (UID: \"37ab0150-0aef-4e09-b54b-ff6201665b33\") " pod="openstack/nova-cell1-cell-mapping-mwvss" Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.140220 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8ae73796-c5c6-47f2-a40c-481842c4dbdd" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.0:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.140504 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8ae73796-c5c6-47f2-a40c-481842c4dbdd" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.0:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.224652 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-scripts\") pod \"nova-cell1-cell-mapping-mwvss\" (UID: \"37ab0150-0aef-4e09-b54b-ff6201665b33\") " pod="openstack/nova-cell1-cell-mapping-mwvss" Jan 05 
14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.224761 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mwvss\" (UID: \"37ab0150-0aef-4e09-b54b-ff6201665b33\") " pod="openstack/nova-cell1-cell-mapping-mwvss" Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.224874 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-config-data\") pod \"nova-cell1-cell-mapping-mwvss\" (UID: \"37ab0150-0aef-4e09-b54b-ff6201665b33\") " pod="openstack/nova-cell1-cell-mapping-mwvss" Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.224971 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfjm4\" (UniqueName: \"kubernetes.io/projected/37ab0150-0aef-4e09-b54b-ff6201665b33-kube-api-access-dfjm4\") pod \"nova-cell1-cell-mapping-mwvss\" (UID: \"37ab0150-0aef-4e09-b54b-ff6201665b33\") " pod="openstack/nova-cell1-cell-mapping-mwvss" Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.229793 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mwvss\" (UID: \"37ab0150-0aef-4e09-b54b-ff6201665b33\") " pod="openstack/nova-cell1-cell-mapping-mwvss" Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.231137 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-scripts\") pod \"nova-cell1-cell-mapping-mwvss\" (UID: \"37ab0150-0aef-4e09-b54b-ff6201665b33\") " pod="openstack/nova-cell1-cell-mapping-mwvss" Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.235662 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-config-data\") pod \"nova-cell1-cell-mapping-mwvss\" (UID: \"37ab0150-0aef-4e09-b54b-ff6201665b33\") " pod="openstack/nova-cell1-cell-mapping-mwvss" Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.249340 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfjm4\" (UniqueName: \"kubernetes.io/projected/37ab0150-0aef-4e09-b54b-ff6201665b33-kube-api-access-dfjm4\") pod \"nova-cell1-cell-mapping-mwvss\" (UID: \"37ab0150-0aef-4e09-b54b-ff6201665b33\") " pod="openstack/nova-cell1-cell-mapping-mwvss" Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.329248 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mwvss" Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.820305 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed9a6303-21dd-4900-94c9-5849989cecc2","Type":"ContainerStarted","Data":"4cdb15510da3673012fdf42f963e1c130ed3428097522cad16b3e057f77594dd"} Jan 05 14:11:56 crc kubenswrapper[4740]: I0105 14:11:56.943877 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mwvss"] Jan 05 14:11:57 crc kubenswrapper[4740]: I0105 14:11:57.836589 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mwvss" event={"ID":"37ab0150-0aef-4e09-b54b-ff6201665b33","Type":"ContainerStarted","Data":"c4f147b5d5b4fddb6922d8baff48cd5c3c767f451bf4b031566a8a1a8c980899"} Jan 05 14:11:57 crc kubenswrapper[4740]: I0105 14:11:57.836956 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mwvss" event={"ID":"37ab0150-0aef-4e09-b54b-ff6201665b33","Type":"ContainerStarted","Data":"9739ec279f505609ae287a17b7207dc39b140781b439b0d50015fb18d1a0e7ef"} Jan 05 14:11:57 crc kubenswrapper[4740]: I0105 14:11:57.840326 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed9a6303-21dd-4900-94c9-5849989cecc2","Type":"ContainerStarted","Data":"c67d685d3faa0c8eb49a2f36e85c9388f73e30b23e13c311f2fddda87c68920c"} Jan 05 14:11:57 crc kubenswrapper[4740]: I0105 14:11:57.857141 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mwvss" podStartSLOduration=2.857124927 podStartE2EDuration="2.857124927s" podCreationTimestamp="2026-01-05 14:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:11:57.85537949 +0000 UTC m=+1367.162288069" watchObservedRunningTime="2026-01-05 14:11:57.857124927 +0000 UTC m=+1367.164033506" Jan 05 14:11:59 crc kubenswrapper[4740]: I0105 14:11:59.871091 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed9a6303-21dd-4900-94c9-5849989cecc2","Type":"ContainerStarted","Data":"4b48ec021f18fcae3c98cec029e1c2165271baf4fcd14d6530ea172c9856470f"} Jan 05 14:11:59 crc kubenswrapper[4740]: I0105 14:11:59.871726 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 05 14:11:59 crc kubenswrapper[4740]: I0105 14:11:59.899246 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.030194342 podStartE2EDuration="6.899219849s" podCreationTimestamp="2026-01-05 14:11:53 +0000 UTC" firstStartedPulling="2026-01-05 14:11:54.861767387 +0000 UTC m=+1364.168675966" lastFinishedPulling="2026-01-05 14:11:58.730792834 +0000 UTC m=+1368.037701473" observedRunningTime="2026-01-05 14:11:59.892959241 +0000 UTC m=+1369.199867840" watchObservedRunningTime="2026-01-05 14:11:59.899219849 +0000 UTC m=+1369.206128438" Jan 05 14:12:02 crc kubenswrapper[4740]: I0105 14:12:02.923782 4740 generic.go:334] "Generic (PLEG): container finished" podID="37ab0150-0aef-4e09-b54b-ff6201665b33" containerID="c4f147b5d5b4fddb6922d8baff48cd5c3c767f451bf4b031566a8a1a8c980899" exitCode=0 Jan 05 14:12:02 crc kubenswrapper[4740]: I0105 14:12:02.923886 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mwvss" 
event={"ID":"37ab0150-0aef-4e09-b54b-ff6201665b33","Type":"ContainerDied","Data":"c4f147b5d5b4fddb6922d8baff48cd5c3c767f451bf4b031566a8a1a8c980899"} Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.198713 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.204456 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.208228 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.423443 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mwvss" Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.481379 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-config-data\") pod \"37ab0150-0aef-4e09-b54b-ff6201665b33\" (UID: \"37ab0150-0aef-4e09-b54b-ff6201665b33\") " Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.481457 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-scripts\") pod \"37ab0150-0aef-4e09-b54b-ff6201665b33\" (UID: \"37ab0150-0aef-4e09-b54b-ff6201665b33\") " Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.481486 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-combined-ca-bundle\") pod \"37ab0150-0aef-4e09-b54b-ff6201665b33\" (UID: \"37ab0150-0aef-4e09-b54b-ff6201665b33\") " Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.481566 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfjm4\" (UniqueName: \"kubernetes.io/projected/37ab0150-0aef-4e09-b54b-ff6201665b33-kube-api-access-dfjm4\") pod \"37ab0150-0aef-4e09-b54b-ff6201665b33\" (UID: \"37ab0150-0aef-4e09-b54b-ff6201665b33\") " Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.489149 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ab0150-0aef-4e09-b54b-ff6201665b33-kube-api-access-dfjm4" (OuterVolumeSpecName: "kube-api-access-dfjm4") pod "37ab0150-0aef-4e09-b54b-ff6201665b33" (UID: "37ab0150-0aef-4e09-b54b-ff6201665b33"). InnerVolumeSpecName "kube-api-access-dfjm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.494332 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-scripts" (OuterVolumeSpecName: "scripts") pod "37ab0150-0aef-4e09-b54b-ff6201665b33" (UID: "37ab0150-0aef-4e09-b54b-ff6201665b33"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.541135 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37ab0150-0aef-4e09-b54b-ff6201665b33" (UID: "37ab0150-0aef-4e09-b54b-ff6201665b33"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.564449 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-config-data" (OuterVolumeSpecName: "config-data") pod "37ab0150-0aef-4e09-b54b-ff6201665b33" (UID: "37ab0150-0aef-4e09-b54b-ff6201665b33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.585086 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.585116 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.585126 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ab0150-0aef-4e09-b54b-ff6201665b33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.585135 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfjm4\" (UniqueName: \"kubernetes.io/projected/37ab0150-0aef-4e09-b54b-ff6201665b33-kube-api-access-dfjm4\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.987505 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mwvss" Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.997667 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mwvss" event={"ID":"37ab0150-0aef-4e09-b54b-ff6201665b33","Type":"ContainerDied","Data":"9739ec279f505609ae287a17b7207dc39b140781b439b0d50015fb18d1a0e7ef"} Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.997776 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9739ec279f505609ae287a17b7207dc39b140781b439b0d50015fb18d1a0e7ef" Jan 05 14:12:04 crc kubenswrapper[4740]: I0105 14:12:04.998616 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 05 14:12:05 crc kubenswrapper[4740]: I0105 14:12:05.062738 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 05 14:12:05 crc kubenswrapper[4740]: I0105 14:12:05.063350 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 05 14:12:05 crc kubenswrapper[4740]: I0105 14:12:05.067692 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 05 14:12:05 crc kubenswrapper[4740]: I0105 14:12:05.080675 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 05 14:12:05 crc kubenswrapper[4740]: I0105 14:12:05.165721 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 05 14:12:05 crc kubenswrapper[4740]: I0105 14:12:05.165959 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4" containerName="nova-scheduler-scheduler" 
containerID="cri-o://ef314be233895a93e2ef22c97fbccac10047a6437e5f0c89615cf5aabd1dbc24" gracePeriod=30 Jan 05 14:12:05 crc kubenswrapper[4740]: I0105 14:12:05.181020 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 05 14:12:05 crc kubenswrapper[4740]: I0105 14:12:05.203333 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.008911 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.016055 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.221389 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-hhmgg"] Jan 05 14:12:06 crc kubenswrapper[4740]: E0105 14:12:06.222057 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ab0150-0aef-4e09-b54b-ff6201665b33" containerName="nova-manage" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.222095 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ab0150-0aef-4e09-b54b-ff6201665b33" containerName="nova-manage" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.222424 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ab0150-0aef-4e09-b54b-ff6201665b33" containerName="nova-manage" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.224177 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.232416 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-hhmgg"] Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.327081 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.327132 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.327177 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.327243 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dncbf\" (UniqueName: \"kubernetes.io/projected/0fc2c7ec-e277-42bf-b04e-030649d2671a-kube-api-access-dncbf\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.327274 4740 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.327316 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-config\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.429827 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.430225 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.431253 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.431339 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.431488 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.432009 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.432264 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dncbf\" (UniqueName: \"kubernetes.io/projected/0fc2c7ec-e277-42bf-b04e-030649d2671a-kube-api-access-dncbf\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.432711 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.433426 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.433535 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-config\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.434464 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-config\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.451588 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dncbf\" (UniqueName: \"kubernetes.io/projected/0fc2c7ec-e277-42bf-b04e-030649d2671a-kube-api-access-dncbf\") pod \"dnsmasq-dns-f84f9ccf-hhmgg\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:06 crc kubenswrapper[4740]: I0105 14:12:06.548907 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:07 crc kubenswrapper[4740]: I0105 14:12:07.019227 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="62f347cb-9580-4d54-adef-f3a06d466739" containerName="nova-metadata-log" containerID="cri-o://dc017291bd85a61bc42bdcdf132cfd29d1879b8864820f91a5eb516ddfaba35f" gracePeriod=30 Jan 05 14:12:07 crc kubenswrapper[4740]: I0105 14:12:07.019510 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8ae73796-c5c6-47f2-a40c-481842c4dbdd" containerName="nova-api-api" containerID="cri-o://b55047bf2933e008f8f6d1470e6e75e2cb6626aaff17689b991270fa46a53823" gracePeriod=30 Jan 05 14:12:07 crc kubenswrapper[4740]: I0105 14:12:07.019400 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="62f347cb-9580-4d54-adef-f3a06d466739" containerName="nova-metadata-metadata" containerID="cri-o://6e808a165c4431fe28715b004fe16a60a04dba3239e57db49e807869fb6e6ecd" gracePeriod=30 Jan 05 14:12:07 crc kubenswrapper[4740]: I0105 14:12:07.019439 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8ae73796-c5c6-47f2-a40c-481842c4dbdd" containerName="nova-api-log" containerID="cri-o://8ab689470a057eec541d8a06d8f58de9d2bc3a56d97455f86e54400484c75e43" gracePeriod=30 Jan 05 14:12:07 crc kubenswrapper[4740]: I0105 14:12:07.207570 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-hhmgg"] Jan 05 14:12:07 crc kubenswrapper[4740]: W0105 14:12:07.217015 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fc2c7ec_e277_42bf_b04e_030649d2671a.slice/crio-764bd0d6adc9d599d8e737f9476e0068155bc8d5ed155c13f8398e765d0c400c WatchSource:0}: Error finding container 764bd0d6adc9d599d8e737f9476e0068155bc8d5ed155c13f8398e765d0c400c: Status 404 returned error can't find the container with id 764bd0d6adc9d599d8e737f9476e0068155bc8d5ed155c13f8398e765d0c400c Jan 05 14:12:08 crc kubenswrapper[4740]: I0105 14:12:08.031623 4740 generic.go:334] "Generic (PLEG): container finished" podID="62f347cb-9580-4d54-adef-f3a06d466739" containerID="dc017291bd85a61bc42bdcdf132cfd29d1879b8864820f91a5eb516ddfaba35f" exitCode=143 Jan 05 14:12:08 crc kubenswrapper[4740]: I0105 14:12:08.033173 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"62f347cb-9580-4d54-adef-f3a06d466739","Type":"ContainerDied","Data":"dc017291bd85a61bc42bdcdf132cfd29d1879b8864820f91a5eb516ddfaba35f"} Jan 05 14:12:08 crc kubenswrapper[4740]: I0105 14:12:08.035587 4740 generic.go:334] "Generic (PLEG): container finished" podID="8ae73796-c5c6-47f2-a40c-481842c4dbdd" containerID="8ab689470a057eec541d8a06d8f58de9d2bc3a56d97455f86e54400484c75e43" exitCode=143 Jan 05 14:12:08 crc kubenswrapper[4740]: I0105 14:12:08.035732 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ae73796-c5c6-47f2-a40c-481842c4dbdd","Type":"ContainerDied","Data":"8ab689470a057eec541d8a06d8f58de9d2bc3a56d97455f86e54400484c75e43"} Jan 05 14:12:08 crc kubenswrapper[4740]: I0105 14:12:08.037613 4740 generic.go:334] "Generic (PLEG): container finished" podID="0fc2c7ec-e277-42bf-b04e-030649d2671a" containerID="10019f97de730bbfea6f14f4e6ca558730938972a7efe8e0786de93ab7d4a586" exitCode=0 Jan 05 14:12:08 crc 
kubenswrapper[4740]: I0105 14:12:08.037729 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" event={"ID":"0fc2c7ec-e277-42bf-b04e-030649d2671a","Type":"ContainerDied","Data":"10019f97de730bbfea6f14f4e6ca558730938972a7efe8e0786de93ab7d4a586"} Jan 05 14:12:08 crc kubenswrapper[4740]: I0105 14:12:08.037866 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" event={"ID":"0fc2c7ec-e277-42bf-b04e-030649d2671a","Type":"ContainerStarted","Data":"764bd0d6adc9d599d8e737f9476e0068155bc8d5ed155c13f8398e765d0c400c"} Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.036925 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.038593 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerName="ceilometer-central-agent" containerID="cri-o://af64dd467bed4ea10abb54c3d3b2a7b142d3924d7af789586dbfbfd6bc8c302c" gracePeriod=30 Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.039776 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerName="proxy-httpd" containerID="cri-o://4b48ec021f18fcae3c98cec029e1c2165271baf4fcd14d6530ea172c9856470f" gracePeriod=30 Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.041583 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerName="ceilometer-notification-agent" containerID="cri-o://4cdb15510da3673012fdf42f963e1c130ed3428097522cad16b3e057f77594dd" gracePeriod=30 Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.041759 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerName="sg-core" containerID="cri-o://c67d685d3faa0c8eb49a2f36e85c9388f73e30b23e13c311f2fddda87c68920c" gracePeriod=30 Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.068256 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.3:3000/\": EOF" Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.070335 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" event={"ID":"0fc2c7ec-e277-42bf-b04e-030649d2671a","Type":"ContainerStarted","Data":"83a1b5f2eea8caf6541119ef2e0f3fbedbf86576fdaaa5c1017cadeed1ff032b"} Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.075513 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.084759 4740 generic.go:334] "Generic (PLEG): container finished" podID="4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4" containerID="ef314be233895a93e2ef22c97fbccac10047a6437e5f0c89615cf5aabd1dbc24" exitCode=0 Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.084808 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4","Type":"ContainerDied","Data":"ef314be233895a93e2ef22c97fbccac10047a6437e5f0c89615cf5aabd1dbc24"} Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.101164 4740 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" podStartSLOduration=3.101145405 podStartE2EDuration="3.101145405s" podCreationTimestamp="2026-01-05 14:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:12:09.095721439 +0000 UTC m=+1378.402630018" watchObservedRunningTime="2026-01-05 14:12:09.101145405 +0000 UTC m=+1378.408053984" Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.489167 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.667832 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-combined-ca-bundle\") pod \"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4\" (UID: \"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4\") " Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.668246 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-config-data\") pod \"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4\" (UID: \"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4\") " Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.668345 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqd28\" (UniqueName: \"kubernetes.io/projected/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-kube-api-access-sqd28\") pod \"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4\" (UID: \"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4\") " Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.673251 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-kube-api-access-sqd28" (OuterVolumeSpecName: "kube-api-access-sqd28") pod "4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4" (UID: "4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4"). InnerVolumeSpecName "kube-api-access-sqd28". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.707131 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-config-data" (OuterVolumeSpecName: "config-data") pod "4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4" (UID: "4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.730384 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4" (UID: "4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.771079 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.771134 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:09 crc kubenswrapper[4740]: I0105 14:12:09.771145 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqd28\" (UniqueName: \"kubernetes.io/projected/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4-kube-api-access-sqd28\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.100193 4740 generic.go:334] "Generic (PLEG): container finished" podID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerID="4b48ec021f18fcae3c98cec029e1c2165271baf4fcd14d6530ea172c9856470f" exitCode=0 Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.100223 4740 generic.go:334] "Generic (PLEG): container finished" podID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerID="c67d685d3faa0c8eb49a2f36e85c9388f73e30b23e13c311f2fddda87c68920c" exitCode=2 Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.100231 4740 generic.go:334] "Generic (PLEG): container finished" podID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerID="af64dd467bed4ea10abb54c3d3b2a7b142d3924d7af789586dbfbfd6bc8c302c" exitCode=0 Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.100277 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed9a6303-21dd-4900-94c9-5849989cecc2","Type":"ContainerDied","Data":"4b48ec021f18fcae3c98cec029e1c2165271baf4fcd14d6530ea172c9856470f"} Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.100323 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed9a6303-21dd-4900-94c9-5849989cecc2","Type":"ContainerDied","Data":"c67d685d3faa0c8eb49a2f36e85c9388f73e30b23e13c311f2fddda87c68920c"} Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.100338 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed9a6303-21dd-4900-94c9-5849989cecc2","Type":"ContainerDied","Data":"af64dd467bed4ea10abb54c3d3b2a7b142d3924d7af789586dbfbfd6bc8c302c"} Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.102129 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4","Type":"ContainerDied","Data":"27eb9bd3883d92842d75c123b8b8b3b8022278b00c3d0e65c898150b3f497654"} Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.102178 4740 scope.go:117] "RemoveContainer" containerID="ef314be233895a93e2ef22c97fbccac10047a6437e5f0c89615cf5aabd1dbc24" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.102142 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.138845 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.151386 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.163050 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 05 14:12:10 crc kubenswrapper[4740]: E0105 14:12:10.163642 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4" containerName="nova-scheduler-scheduler" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.163660 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4" containerName="nova-scheduler-scheduler" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.163891 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4" containerName="nova-scheduler-scheduler" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.164748 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.187627 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.208049 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.289719 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043aa2af-a684-451e-a812-cf42bd753490-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"043aa2af-a684-451e-a812-cf42bd753490\") " pod="openstack/nova-scheduler-0" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.289897 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043aa2af-a684-451e-a812-cf42bd753490-config-data\") pod \"nova-scheduler-0\" (UID: \"043aa2af-a684-451e-a812-cf42bd753490\") " pod="openstack/nova-scheduler-0" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.290326 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r9bh\" (UniqueName: \"kubernetes.io/projected/043aa2af-a684-451e-a812-cf42bd753490-kube-api-access-2r9bh\") pod \"nova-scheduler-0\" (UID: \"043aa2af-a684-451e-a812-cf42bd753490\") " pod="openstack/nova-scheduler-0" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.392391 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r9bh\" (UniqueName: \"kubernetes.io/projected/043aa2af-a684-451e-a812-cf42bd753490-kube-api-access-2r9bh\") pod \"nova-scheduler-0\" (UID: \"043aa2af-a684-451e-a812-cf42bd753490\") " pod="openstack/nova-scheduler-0" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.392754 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043aa2af-a684-451e-a812-cf42bd753490-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"043aa2af-a684-451e-a812-cf42bd753490\") " pod="openstack/nova-scheduler-0" Jan 05 
14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.392840 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043aa2af-a684-451e-a812-cf42bd753490-config-data\") pod \"nova-scheduler-0\" (UID: \"043aa2af-a684-451e-a812-cf42bd753490\") " pod="openstack/nova-scheduler-0" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.398898 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043aa2af-a684-451e-a812-cf42bd753490-config-data\") pod \"nova-scheduler-0\" (UID: \"043aa2af-a684-451e-a812-cf42bd753490\") " pod="openstack/nova-scheduler-0" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.401788 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043aa2af-a684-451e-a812-cf42bd753490-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"043aa2af-a684-451e-a812-cf42bd753490\") " pod="openstack/nova-scheduler-0" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.415337 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r9bh\" (UniqueName: \"kubernetes.io/projected/043aa2af-a684-451e-a812-cf42bd753490-kube-api-access-2r9bh\") pod \"nova-scheduler-0\" (UID: \"043aa2af-a684-451e-a812-cf42bd753490\") " pod="openstack/nova-scheduler-0" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.525785 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="62f347cb-9580-4d54-adef-f3a06d466739" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.254:8775/\": read tcp 10.217.0.2:38418->10.217.0.254:8775: read: connection reset by peer" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.526375 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="62f347cb-9580-4d54-adef-f3a06d466739" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.254:8775/\": read tcp 10.217.0.2:38414->10.217.0.254:8775: read: connection reset by peer" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.543862 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.767940 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.908009 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae73796-c5c6-47f2-a40c-481842c4dbdd-combined-ca-bundle\") pod \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\" (UID: \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\") " Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.908077 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ae73796-c5c6-47f2-a40c-481842c4dbdd-logs\") pod \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\" (UID: \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\") " Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.908334 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbxmr\" (UniqueName: \"kubernetes.io/projected/8ae73796-c5c6-47f2-a40c-481842c4dbdd-kube-api-access-dbxmr\") pod \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\" (UID: \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\") " Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.908407 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ae73796-c5c6-47f2-a40c-481842c4dbdd-config-data\") pod \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\" (UID: \"8ae73796-c5c6-47f2-a40c-481842c4dbdd\") " Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.908705 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ae73796-c5c6-47f2-a40c-481842c4dbdd-logs" (OuterVolumeSpecName: "logs") pod "8ae73796-c5c6-47f2-a40c-481842c4dbdd" (UID: "8ae73796-c5c6-47f2-a40c-481842c4dbdd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.909102 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ae73796-c5c6-47f2-a40c-481842c4dbdd-logs\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.913472 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ae73796-c5c6-47f2-a40c-481842c4dbdd-kube-api-access-dbxmr" (OuterVolumeSpecName: "kube-api-access-dbxmr") pod "8ae73796-c5c6-47f2-a40c-481842c4dbdd" (UID: "8ae73796-c5c6-47f2-a40c-481842c4dbdd"). InnerVolumeSpecName "kube-api-access-dbxmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.968299 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae73796-c5c6-47f2-a40c-481842c4dbdd-config-data" (OuterVolumeSpecName: "config-data") pod "8ae73796-c5c6-47f2-a40c-481842c4dbdd" (UID: "8ae73796-c5c6-47f2-a40c-481842c4dbdd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.968530 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae73796-c5c6-47f2-a40c-481842c4dbdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ae73796-c5c6-47f2-a40c-481842c4dbdd" (UID: "8ae73796-c5c6-47f2-a40c-481842c4dbdd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:10 crc kubenswrapper[4740]: I0105 14:12:10.990866 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4" path="/var/lib/kubelet/pods/4e33bb1c-f72d-4c4b-a7c2-76889d62e9e4/volumes" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.010883 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ae73796-c5c6-47f2-a40c-481842c4dbdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.010912 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbxmr\" (UniqueName: \"kubernetes.io/projected/8ae73796-c5c6-47f2-a40c-481842c4dbdd-kube-api-access-dbxmr\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.010923 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ae73796-c5c6-47f2-a40c-481842c4dbdd-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.069884 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.129275 4740 generic.go:334] "Generic (PLEG): container finished" podID="62f347cb-9580-4d54-adef-f3a06d466739" containerID="6e808a165c4431fe28715b004fe16a60a04dba3239e57db49e807869fb6e6ecd" exitCode=0 Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.129448 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"62f347cb-9580-4d54-adef-f3a06d466739","Type":"ContainerDied","Data":"6e808a165c4431fe28715b004fe16a60a04dba3239e57db49e807869fb6e6ecd"} Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.129475 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"62f347cb-9580-4d54-adef-f3a06d466739","Type":"ContainerDied","Data":"8c9e0ab460a502fb39663c4c8a9f0a3ea9e9456e74802c62c596c4a9aaccd270"} Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.129493 4740 scope.go:117] "RemoveContainer" containerID="6e808a165c4431fe28715b004fe16a60a04dba3239e57db49e807869fb6e6ecd" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.129613 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.136356 4740 generic.go:334] "Generic (PLEG): container finished" podID="8ae73796-c5c6-47f2-a40c-481842c4dbdd" containerID="b55047bf2933e008f8f6d1470e6e75e2cb6626aaff17689b991270fa46a53823" exitCode=0 Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.136390 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ae73796-c5c6-47f2-a40c-481842c4dbdd","Type":"ContainerDied","Data":"b55047bf2933e008f8f6d1470e6e75e2cb6626aaff17689b991270fa46a53823"} Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.136412 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ae73796-c5c6-47f2-a40c-481842c4dbdd","Type":"ContainerDied","Data":"57ed3793a6de1ae51b78de2b64565c03b5d50608e7ec8a7d5c2c498137c2d52c"} Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.136459 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.163217 4740 scope.go:117] "RemoveContainer" containerID="dc017291bd85a61bc42bdcdf132cfd29d1879b8864820f91a5eb516ddfaba35f" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.184050 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.216085 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62f347cb-9580-4d54-adef-f3a06d466739-logs\") pod \"62f347cb-9580-4d54-adef-f3a06d466739\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.216281 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9wpw\" (UniqueName: \"kubernetes.io/projected/62f347cb-9580-4d54-adef-f3a06d466739-kube-api-access-v9wpw\") pod \"62f347cb-9580-4d54-adef-f3a06d466739\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.216318 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-config-data\") pod \"62f347cb-9580-4d54-adef-f3a06d466739\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.216399 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-nova-metadata-tls-certs\") pod \"62f347cb-9580-4d54-adef-f3a06d466739\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.216439 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-combined-ca-bundle\") pod \"62f347cb-9580-4d54-adef-f3a06d466739\" (UID: \"62f347cb-9580-4d54-adef-f3a06d466739\") " Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.225359 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62f347cb-9580-4d54-adef-f3a06d466739-logs" (OuterVolumeSpecName: "logs") pod "62f347cb-9580-4d54-adef-f3a06d466739" (UID: "62f347cb-9580-4d54-adef-f3a06d466739"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.244199 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62f347cb-9580-4d54-adef-f3a06d466739-kube-api-access-v9wpw" (OuterVolumeSpecName: "kube-api-access-v9wpw") pod "62f347cb-9580-4d54-adef-f3a06d466739" (UID: "62f347cb-9580-4d54-adef-f3a06d466739"). InnerVolumeSpecName "kube-api-access-v9wpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.272737 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.289231 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62f347cb-9580-4d54-adef-f3a06d466739" (UID: "62f347cb-9580-4d54-adef-f3a06d466739"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:11 crc kubenswrapper[4740]: E0105 14:12:11.293244 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ae73796_c5c6_47f2_a40c_481842c4dbdd.slice\": RecentStats: unable to find data in memory cache]" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.295909 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-config-data" (OuterVolumeSpecName: "config-data") pod "62f347cb-9580-4d54-adef-f3a06d466739" (UID: "62f347cb-9580-4d54-adef-f3a06d466739"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.298407 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 05 14:12:11 crc kubenswrapper[4740]: E0105 14:12:11.298972 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ae73796-c5c6-47f2-a40c-481842c4dbdd" containerName="nova-api-api" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.298993 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ae73796-c5c6-47f2-a40c-481842c4dbdd" containerName="nova-api-api" Jan 05 14:12:11 crc kubenswrapper[4740]: E0105 14:12:11.299033 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f347cb-9580-4d54-adef-f3a06d466739" containerName="nova-metadata-metadata" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.299040 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f347cb-9580-4d54-adef-f3a06d466739" containerName="nova-metadata-metadata" Jan 05 14:12:11 crc kubenswrapper[4740]: E0105 14:12:11.299295 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f347cb-9580-4d54-adef-f3a06d466739" containerName="nova-metadata-log" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.299312 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f347cb-9580-4d54-adef-f3a06d466739" containerName="nova-metadata-log" Jan 05 14:12:11 crc kubenswrapper[4740]: E0105 14:12:11.299328 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ae73796-c5c6-47f2-a40c-481842c4dbdd" containerName="nova-api-log" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.299334 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ae73796-c5c6-47f2-a40c-481842c4dbdd" containerName="nova-api-log" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.299585 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ae73796-c5c6-47f2-a40c-481842c4dbdd" containerName="nova-api-api" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.299597 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ae73796-c5c6-47f2-a40c-481842c4dbdd" containerName="nova-api-log" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.299623 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="62f347cb-9580-4d54-adef-f3a06d466739" containerName="nova-metadata-log" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.299638 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="62f347cb-9580-4d54-adef-f3a06d466739" containerName="nova-metadata-metadata" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.301038 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.305428 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.305706 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.308164 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.321533 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.321586 4740 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62f347cb-9580-4d54-adef-f3a06d466739-logs\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.321598 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9wpw\" (UniqueName: \"kubernetes.io/projected/62f347cb-9580-4d54-adef-f3a06d466739-kube-api-access-v9wpw\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.321608 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.323196 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.362198 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.393840 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "62f347cb-9580-4d54-adef-f3a06d466739" (UID: "62f347cb-9580-4d54-adef-f3a06d466739"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.423358 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03e6abd6-0882-4930-90c6-d2ba9ded2d49-public-tls-certs\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.423400 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03e6abd6-0882-4930-90c6-d2ba9ded2d49-internal-tls-certs\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.423440 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7dcj\" (UniqueName: \"kubernetes.io/projected/03e6abd6-0882-4930-90c6-d2ba9ded2d49-kube-api-access-x7dcj\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.423702 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e6abd6-0882-4930-90c6-d2ba9ded2d49-config-data\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.423924 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e6abd6-0882-4930-90c6-d2ba9ded2d49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.424060 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03e6abd6-0882-4930-90c6-d2ba9ded2d49-logs\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.424260 4740 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/62f347cb-9580-4d54-adef-f3a06d466739-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.495796 4740 scope.go:117] "RemoveContainer" containerID="6e808a165c4431fe28715b004fe16a60a04dba3239e57db49e807869fb6e6ecd" Jan 05 14:12:11 crc kubenswrapper[4740]: E0105 14:12:11.497475 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e808a165c4431fe28715b004fe16a60a04dba3239e57db49e807869fb6e6ecd\": container with ID starting with 6e808a165c4431fe28715b004fe16a60a04dba3239e57db49e807869fb6e6ecd not found: ID does not exist" containerID="6e808a165c4431fe28715b004fe16a60a04dba3239e57db49e807869fb6e6ecd" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.497526 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e808a165c4431fe28715b004fe16a60a04dba3239e57db49e807869fb6e6ecd"} err="failed to get container status \"6e808a165c4431fe28715b004fe16a60a04dba3239e57db49e807869fb6e6ecd\": rpc 
error: code = NotFound desc = could not find container \"6e808a165c4431fe28715b004fe16a60a04dba3239e57db49e807869fb6e6ecd\": container with ID starting with 6e808a165c4431fe28715b004fe16a60a04dba3239e57db49e807869fb6e6ecd not found: ID does not exist" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.497556 4740 scope.go:117] "RemoveContainer" containerID="dc017291bd85a61bc42bdcdf132cfd29d1879b8864820f91a5eb516ddfaba35f" Jan 05 14:12:11 crc kubenswrapper[4740]: E0105 14:12:11.504524 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc017291bd85a61bc42bdcdf132cfd29d1879b8864820f91a5eb516ddfaba35f\": container with ID starting with dc017291bd85a61bc42bdcdf132cfd29d1879b8864820f91a5eb516ddfaba35f not found: ID does not exist" containerID="dc017291bd85a61bc42bdcdf132cfd29d1879b8864820f91a5eb516ddfaba35f" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.504583 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc017291bd85a61bc42bdcdf132cfd29d1879b8864820f91a5eb516ddfaba35f"} err="failed to get container status \"dc017291bd85a61bc42bdcdf132cfd29d1879b8864820f91a5eb516ddfaba35f\": rpc error: code = NotFound desc = could not find container \"dc017291bd85a61bc42bdcdf132cfd29d1879b8864820f91a5eb516ddfaba35f\": container with ID starting with dc017291bd85a61bc42bdcdf132cfd29d1879b8864820f91a5eb516ddfaba35f not found: ID does not exist" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.504610 4740 scope.go:117] "RemoveContainer" containerID="b55047bf2933e008f8f6d1470e6e75e2cb6626aaff17689b991270fa46a53823" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.517991 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.540823 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e6abd6-0882-4930-90c6-d2ba9ded2d49-config-data\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.540939 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e6abd6-0882-4930-90c6-d2ba9ded2d49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.540996 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03e6abd6-0882-4930-90c6-d2ba9ded2d49-logs\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.541100 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03e6abd6-0882-4930-90c6-d2ba9ded2d49-public-tls-certs\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.541123 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03e6abd6-0882-4930-90c6-d2ba9ded2d49-internal-tls-certs\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " 
pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.541163 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7dcj\" (UniqueName: \"kubernetes.io/projected/03e6abd6-0882-4930-90c6-d2ba9ded2d49-kube-api-access-x7dcj\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.541646 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03e6abd6-0882-4930-90c6-d2ba9ded2d49-logs\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.546338 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e6abd6-0882-4930-90c6-d2ba9ded2d49-config-data\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.546411 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.547283 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e6abd6-0882-4930-90c6-d2ba9ded2d49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.558315 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.562201 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.563896 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7dcj\" (UniqueName: \"kubernetes.io/projected/03e6abd6-0882-4930-90c6-d2ba9ded2d49-kube-api-access-x7dcj\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.564205 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03e6abd6-0882-4930-90c6-d2ba9ded2d49-internal-tls-certs\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.565608 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.566597 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.570647 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.572602 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03e6abd6-0882-4930-90c6-d2ba9ded2d49-public-tls-certs\") pod \"nova-api-0\" (UID: \"03e6abd6-0882-4930-90c6-d2ba9ded2d49\") " pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.586785 4740 scope.go:117] "RemoveContainer" containerID="8ab689470a057eec541d8a06d8f58de9d2bc3a56d97455f86e54400484c75e43" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.614297 4740 scope.go:117] "RemoveContainer" containerID="b55047bf2933e008f8f6d1470e6e75e2cb6626aaff17689b991270fa46a53823" Jan 05 14:12:11 crc kubenswrapper[4740]: E0105 14:12:11.614708 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55047bf2933e008f8f6d1470e6e75e2cb6626aaff17689b991270fa46a53823\": container with ID starting with b55047bf2933e008f8f6d1470e6e75e2cb6626aaff17689b991270fa46a53823 not found: ID does not exist" containerID="b55047bf2933e008f8f6d1470e6e75e2cb6626aaff17689b991270fa46a53823" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.614757 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55047bf2933e008f8f6d1470e6e75e2cb6626aaff17689b991270fa46a53823"} err="failed to get container status \"b55047bf2933e008f8f6d1470e6e75e2cb6626aaff17689b991270fa46a53823\": rpc error: code = NotFound desc = could not find container \"b55047bf2933e008f8f6d1470e6e75e2cb6626aaff17689b991270fa46a53823\": container with ID starting with b55047bf2933e008f8f6d1470e6e75e2cb6626aaff17689b991270fa46a53823 not found: ID does not exist" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.614793 4740 scope.go:117] "RemoveContainer" containerID="8ab689470a057eec541d8a06d8f58de9d2bc3a56d97455f86e54400484c75e43" Jan 05 14:12:11 crc kubenswrapper[4740]: E0105 14:12:11.615001 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ab689470a057eec541d8a06d8f58de9d2bc3a56d97455f86e54400484c75e43\": container with ID starting with 
8ab689470a057eec541d8a06d8f58de9d2bc3a56d97455f86e54400484c75e43 not found: ID does not exist" containerID="8ab689470a057eec541d8a06d8f58de9d2bc3a56d97455f86e54400484c75e43" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.615021 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ab689470a057eec541d8a06d8f58de9d2bc3a56d97455f86e54400484c75e43"} err="failed to get container status \"8ab689470a057eec541d8a06d8f58de9d2bc3a56d97455f86e54400484c75e43\": rpc error: code = NotFound desc = could not find container \"8ab689470a057eec541d8a06d8f58de9d2bc3a56d97455f86e54400484c75e43\": container with ID starting with 8ab689470a057eec541d8a06d8f58de9d2bc3a56d97455f86e54400484c75e43 not found: ID does not exist" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.644427 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd6f1475-5188-4831-91b3-e6741b308a2e-config-data\") pod \"nova-metadata-0\" (UID: \"bd6f1475-5188-4831-91b3-e6741b308a2e\") " pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.644567 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54skb\" (UniqueName: \"kubernetes.io/projected/bd6f1475-5188-4831-91b3-e6741b308a2e-kube-api-access-54skb\") pod \"nova-metadata-0\" (UID: \"bd6f1475-5188-4831-91b3-e6741b308a2e\") " pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.645086 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6f1475-5188-4831-91b3-e6741b308a2e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bd6f1475-5188-4831-91b3-e6741b308a2e\") " pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.645566 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd6f1475-5188-4831-91b3-e6741b308a2e-logs\") pod \"nova-metadata-0\" (UID: \"bd6f1475-5188-4831-91b3-e6741b308a2e\") " pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.645670 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd6f1475-5188-4831-91b3-e6741b308a2e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bd6f1475-5188-4831-91b3-e6741b308a2e\") " pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.748090 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd6f1475-5188-4831-91b3-e6741b308a2e-logs\") pod \"nova-metadata-0\" (UID: \"bd6f1475-5188-4831-91b3-e6741b308a2e\") " pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.748161 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd6f1475-5188-4831-91b3-e6741b308a2e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bd6f1475-5188-4831-91b3-e6741b308a2e\") " pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.748230 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/bd6f1475-5188-4831-91b3-e6741b308a2e-config-data\") pod \"nova-metadata-0\" (UID: \"bd6f1475-5188-4831-91b3-e6741b308a2e\") " pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.748275 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54skb\" (UniqueName: \"kubernetes.io/projected/bd6f1475-5188-4831-91b3-e6741b308a2e-kube-api-access-54skb\") pod \"nova-metadata-0\" (UID: \"bd6f1475-5188-4831-91b3-e6741b308a2e\") " pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.748374 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6f1475-5188-4831-91b3-e6741b308a2e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bd6f1475-5188-4831-91b3-e6741b308a2e\") " pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.749017 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd6f1475-5188-4831-91b3-e6741b308a2e-logs\") pod \"nova-metadata-0\" (UID: \"bd6f1475-5188-4831-91b3-e6741b308a2e\") " pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.751450 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd6f1475-5188-4831-91b3-e6741b308a2e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bd6f1475-5188-4831-91b3-e6741b308a2e\") " pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.752121 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6f1475-5188-4831-91b3-e6741b308a2e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bd6f1475-5188-4831-91b3-e6741b308a2e\") " pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.752916 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd6f1475-5188-4831-91b3-e6741b308a2e-config-data\") pod \"nova-metadata-0\" (UID: \"bd6f1475-5188-4831-91b3-e6741b308a2e\") " pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.763862 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54skb\" (UniqueName: \"kubernetes.io/projected/bd6f1475-5188-4831-91b3-e6741b308a2e-kube-api-access-54skb\") pod \"nova-metadata-0\" (UID: \"bd6f1475-5188-4831-91b3-e6741b308a2e\") " pod="openstack/nova-metadata-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.807333 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 05 14:12:11 crc kubenswrapper[4740]: I0105 14:12:11.899050 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 05 14:12:12 crc kubenswrapper[4740]: I0105 14:12:12.152739 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"043aa2af-a684-451e-a812-cf42bd753490","Type":"ContainerStarted","Data":"aa1ff976970ae762b0d62d62335cb256e81be1499f739fd841a01bf84e822b48"} Jan 05 14:12:12 crc kubenswrapper[4740]: I0105 14:12:12.153103 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"043aa2af-a684-451e-a812-cf42bd753490","Type":"ContainerStarted","Data":"6c1bd1cd760861f08e36aa3ddcd6e0618f1d039eef91d6769a6d2f2b9a34c06a"} Jan 05 14:12:12 crc kubenswrapper[4740]: I0105 14:12:12.182277 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.182252161 podStartE2EDuration="2.182252161s" podCreationTimestamp="2026-01-05 14:12:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:12:12.169918989 +0000 UTC m=+1381.476827588" watchObservedRunningTime="2026-01-05 14:12:12.182252161 +0000 UTC m=+1381.489160740" Jan 05 14:12:12 crc kubenswrapper[4740]: I0105 14:12:12.300301 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 05 14:12:12 crc kubenswrapper[4740]: W0105 14:12:12.301345 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03e6abd6_0882_4930_90c6_d2ba9ded2d49.slice/crio-34a903f7064b94674efeb748f018790cdc13f181ff5cc774c6ff6e4cc2e974c2 WatchSource:0}: Error finding container 34a903f7064b94674efeb748f018790cdc13f181ff5cc774c6ff6e4cc2e974c2: Status 404 returned error can't find the container with id 34a903f7064b94674efeb748f018790cdc13f181ff5cc774c6ff6e4cc2e974c2 Jan 05 14:12:12 crc kubenswrapper[4740]: I0105 14:12:12.493976 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 05 14:12:12 crc kubenswrapper[4740]: I0105 14:12:12.983501 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62f347cb-9580-4d54-adef-f3a06d466739" path="/var/lib/kubelet/pods/62f347cb-9580-4d54-adef-f3a06d466739/volumes" Jan 05 14:12:12 crc kubenswrapper[4740]: I0105 14:12:12.984387 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ae73796-c5c6-47f2-a40c-481842c4dbdd" path="/var/lib/kubelet/pods/8ae73796-c5c6-47f2-a40c-481842c4dbdd/volumes" Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.171892 4740 generic.go:334] "Generic (PLEG): container finished" podID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerID="4cdb15510da3673012fdf42f963e1c130ed3428097522cad16b3e057f77594dd" exitCode=0 Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.171940 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed9a6303-21dd-4900-94c9-5849989cecc2","Type":"ContainerDied","Data":"4cdb15510da3673012fdf42f963e1c130ed3428097522cad16b3e057f77594dd"} Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.175496 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd6f1475-5188-4831-91b3-e6741b308a2e","Type":"ContainerStarted","Data":"7e912386f0307acb03a1fb72395ad86a7e34d3a25e183ab2cc506aafc3c15b55"} Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.175532 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"bd6f1475-5188-4831-91b3-e6741b308a2e","Type":"ContainerStarted","Data":"73f421d9e9fb4f8bbc4c8bff06ecbb2c88edbe77af480a64802542cdde4d0f9d"} Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.175541 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd6f1475-5188-4831-91b3-e6741b308a2e","Type":"ContainerStarted","Data":"61cace9434cbcec2d97546aec0468bffa22cbe1032885abf549dac342f335b10"} Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.178251 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"03e6abd6-0882-4930-90c6-d2ba9ded2d49","Type":"ContainerStarted","Data":"5adfb87e61536f584250a74204834f4b640fb158a921172dff117a08e80b99aa"} Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.178291 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"03e6abd6-0882-4930-90c6-d2ba9ded2d49","Type":"ContainerStarted","Data":"ec1a7c9dd495dbc00ba633f11ba8b35c8c6515b749a202d765ace0690f38afa0"} Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.178306 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"03e6abd6-0882-4930-90c6-d2ba9ded2d49","Type":"ContainerStarted","Data":"34a903f7064b94674efeb748f018790cdc13f181ff5cc774c6ff6e4cc2e974c2"} Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.215862 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.21584207 podStartE2EDuration="2.21584207s" podCreationTimestamp="2026-01-05 14:12:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:12:13.208213014 +0000 UTC m=+1382.515121603" watchObservedRunningTime="2026-01-05 14:12:13.21584207 +0000 UTC m=+1382.522750649" Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.240406 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.240383859 podStartE2EDuration="2.240383859s" podCreationTimestamp="2026-01-05 14:12:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:12:13.22998874 +0000 UTC m=+1382.536897349" watchObservedRunningTime="2026-01-05 14:12:13.240383859 +0000 UTC m=+1382.547292438" Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.499975 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.606956 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-sg-core-conf-yaml\") pod \"ed9a6303-21dd-4900-94c9-5849989cecc2\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.607430 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed9a6303-21dd-4900-94c9-5849989cecc2-log-httpd\") pod \"ed9a6303-21dd-4900-94c9-5849989cecc2\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.607468 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-scripts\") pod \"ed9a6303-21dd-4900-94c9-5849989cecc2\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.607512 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-combined-ca-bundle\") pod \"ed9a6303-21dd-4900-94c9-5849989cecc2\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.607582 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-config-data\") pod \"ed9a6303-21dd-4900-94c9-5849989cecc2\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.607608 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed9a6303-21dd-4900-94c9-5849989cecc2-run-httpd\") pod \"ed9a6303-21dd-4900-94c9-5849989cecc2\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.607693 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krblv\" (UniqueName: \"kubernetes.io/projected/ed9a6303-21dd-4900-94c9-5849989cecc2-kube-api-access-krblv\") pod \"ed9a6303-21dd-4900-94c9-5849989cecc2\" (UID: \"ed9a6303-21dd-4900-94c9-5849989cecc2\") " Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.607984 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed9a6303-21dd-4900-94c9-5849989cecc2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ed9a6303-21dd-4900-94c9-5849989cecc2" (UID: "ed9a6303-21dd-4900-94c9-5849989cecc2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.608210 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed9a6303-21dd-4900-94c9-5849989cecc2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ed9a6303-21dd-4900-94c9-5849989cecc2" (UID: "ed9a6303-21dd-4900-94c9-5849989cecc2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.609050 4740 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed9a6303-21dd-4900-94c9-5849989cecc2-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.609110 4740 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed9a6303-21dd-4900-94c9-5849989cecc2-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.613049 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed9a6303-21dd-4900-94c9-5849989cecc2-kube-api-access-krblv" (OuterVolumeSpecName: "kube-api-access-krblv") pod "ed9a6303-21dd-4900-94c9-5849989cecc2" (UID: "ed9a6303-21dd-4900-94c9-5849989cecc2"). InnerVolumeSpecName "kube-api-access-krblv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.613129 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-scripts" (OuterVolumeSpecName: "scripts") pod "ed9a6303-21dd-4900-94c9-5849989cecc2" (UID: "ed9a6303-21dd-4900-94c9-5849989cecc2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.699958 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ed9a6303-21dd-4900-94c9-5849989cecc2" (UID: "ed9a6303-21dd-4900-94c9-5849989cecc2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.711260 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krblv\" (UniqueName: \"kubernetes.io/projected/ed9a6303-21dd-4900-94c9-5849989cecc2-kube-api-access-krblv\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.711290 4740 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.711301 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.729862 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed9a6303-21dd-4900-94c9-5849989cecc2" (UID: "ed9a6303-21dd-4900-94c9-5849989cecc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.779550 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-config-data" (OuterVolumeSpecName: "config-data") pod "ed9a6303-21dd-4900-94c9-5849989cecc2" (UID: "ed9a6303-21dd-4900-94c9-5849989cecc2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.813340 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:13 crc kubenswrapper[4740]: I0105 14:12:13.813378 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed9a6303-21dd-4900-94c9-5849989cecc2-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.191334 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.191953 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed9a6303-21dd-4900-94c9-5849989cecc2","Type":"ContainerDied","Data":"a7a72b92be5e8736a7542c031be6ab58af6aef1d236d64ed64a3f80306fe070c"} Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.192043 4740 scope.go:117] "RemoveContainer" containerID="4b48ec021f18fcae3c98cec029e1c2165271baf4fcd14d6530ea172c9856470f" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.293350 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.298206 4740 scope.go:117] "RemoveContainer" containerID="c67d685d3faa0c8eb49a2f36e85c9388f73e30b23e13c311f2fddda87c68920c" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.312527 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.324370 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:12:14 crc kubenswrapper[4740]: E0105 14:12:14.324838 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerName="ceilometer-central-agent" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.324855 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerName="ceilometer-central-agent" Jan 05 14:12:14 crc kubenswrapper[4740]: E0105 14:12:14.324883 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerName="proxy-httpd" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.324891 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerName="proxy-httpd" Jan 05 14:12:14 crc kubenswrapper[4740]: E0105 14:12:14.324906 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerName="ceilometer-notification-agent" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.324915 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerName="ceilometer-notification-agent" Jan 05 14:12:14 crc kubenswrapper[4740]: E0105 14:12:14.324934 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerName="sg-core" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.324942 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerName="sg-core" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.325242 4740 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerName="sg-core" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.325270 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerName="ceilometer-central-agent" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.325282 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerName="proxy-httpd" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.325294 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" containerName="ceilometer-notification-agent" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.331097 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.333674 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.334695 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.338039 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.338984 4740 scope.go:117] "RemoveContainer" containerID="4cdb15510da3673012fdf42f963e1c130ed3428097522cad16b3e057f77594dd" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.380914 4740 scope.go:117] "RemoveContainer" containerID="af64dd467bed4ea10abb54c3d3b2a7b142d3924d7af789586dbfbfd6bc8c302c" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.436219 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-run-httpd\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.436281 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.436520 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgnl9\" (UniqueName: \"kubernetes.io/projected/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-kube-api-access-mgnl9\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.436798 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-log-httpd\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.436881 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-scripts\") pod \"ceilometer-0\" (UID: 
\"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.436941 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-config-data\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.437049 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.540504 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgnl9\" (UniqueName: \"kubernetes.io/projected/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-kube-api-access-mgnl9\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.540724 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-log-httpd\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.540822 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-scripts\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.540949 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-config-data\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.541099 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.541232 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-log-httpd\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.541320 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-run-httpd\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.541428 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.541699 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-run-httpd\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.545665 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.546228 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.547716 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-config-data\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.548091 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-scripts\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.558295 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgnl9\" (UniqueName: \"kubernetes.io/projected/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-kube-api-access-mgnl9\") pod \"ceilometer-0\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.649849 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:12:14 crc kubenswrapper[4740]: I0105 14:12:14.981428 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed9a6303-21dd-4900-94c9-5849989cecc2" path="/var/lib/kubelet/pods/ed9a6303-21dd-4900-94c9-5849989cecc2/volumes" Jan 05 14:12:15 crc kubenswrapper[4740]: I0105 14:12:15.195016 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:12:15 crc kubenswrapper[4740]: I0105 14:12:15.214358 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b00767-28bf-4158-9f4e-bb3c5dafd7d5","Type":"ContainerStarted","Data":"f4f84c5c0207eb0e7b773789bd867072f97d162d913663f89ab61a1d967efd08"} Jan 05 14:12:15 crc kubenswrapper[4740]: I0105 14:12:15.545327 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 05 14:12:16 crc kubenswrapper[4740]: I0105 14:12:16.247761 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b00767-28bf-4158-9f4e-bb3c5dafd7d5","Type":"ContainerStarted","Data":"e1e9b1fce0283d30a03b8f07252671b21169cb2d8c83348559c49139fb8c4386"} Jan 05 14:12:16 crc kubenswrapper[4740]: I0105 14:12:16.551898 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:12:16 crc kubenswrapper[4740]: I0105 14:12:16.643089 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-jqjqm"] Jan 05 14:12:16 crc kubenswrapper[4740]: I0105 14:12:16.643307 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" podUID="69df31f0-004d-41d9-8c4c-c0bc865ff354" containerName="dnsmasq-dns" containerID="cri-o://b26e21512e236401b079a1d8112670b3253bcf4216720cc81e8ff27ebe6cb1de" gracePeriod=10 Jan 05 14:12:16 crc kubenswrapper[4740]: I0105 14:12:16.899359 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 05 14:12:16 crc kubenswrapper[4740]: I0105 14:12:16.899688 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.271995 4740 generic.go:334] "Generic (PLEG): container finished" podID="69df31f0-004d-41d9-8c4c-c0bc865ff354" containerID="b26e21512e236401b079a1d8112670b3253bcf4216720cc81e8ff27ebe6cb1de" exitCode=0 Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.272163 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" event={"ID":"69df31f0-004d-41d9-8c4c-c0bc865ff354","Type":"ContainerDied","Data":"b26e21512e236401b079a1d8112670b3253bcf4216720cc81e8ff27ebe6cb1de"} Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.272801 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" event={"ID":"69df31f0-004d-41d9-8c4c-c0bc865ff354","Type":"ContainerDied","Data":"ef314bcd6be6683cb1c18c4f6c2e8d69e561b18350d2d2c4626c9e7dbef901d7"} Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.272834 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef314bcd6be6683cb1c18c4f6c2e8d69e561b18350d2d2c4626c9e7dbef901d7" Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.274789 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"71b00767-28bf-4158-9f4e-bb3c5dafd7d5","Type":"ContainerStarted","Data":"681acffd14d38ee97ffaa04619d8ca2e02a1a221f2a07acb2863bbe1285ad428"} Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.297138 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.445858 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-dns-svc\") pod \"69df31f0-004d-41d9-8c4c-c0bc865ff354\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.445938 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-dns-swift-storage-0\") pod \"69df31f0-004d-41d9-8c4c-c0bc865ff354\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.446136 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-ovsdbserver-sb\") pod \"69df31f0-004d-41d9-8c4c-c0bc865ff354\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.446210 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl6kl\" (UniqueName: \"kubernetes.io/projected/69df31f0-004d-41d9-8c4c-c0bc865ff354-kube-api-access-hl6kl\") pod \"69df31f0-004d-41d9-8c4c-c0bc865ff354\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.446258 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-ovsdbserver-nb\") pod \"69df31f0-004d-41d9-8c4c-c0bc865ff354\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.446398 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-config\") pod \"69df31f0-004d-41d9-8c4c-c0bc865ff354\" (UID: \"69df31f0-004d-41d9-8c4c-c0bc865ff354\") " Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.454810 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69df31f0-004d-41d9-8c4c-c0bc865ff354-kube-api-access-hl6kl" (OuterVolumeSpecName: "kube-api-access-hl6kl") pod "69df31f0-004d-41d9-8c4c-c0bc865ff354" (UID: "69df31f0-004d-41d9-8c4c-c0bc865ff354"). InnerVolumeSpecName "kube-api-access-hl6kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.529661 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-config" (OuterVolumeSpecName: "config") pod "69df31f0-004d-41d9-8c4c-c0bc865ff354" (UID: "69df31f0-004d-41d9-8c4c-c0bc865ff354"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.535321 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "69df31f0-004d-41d9-8c4c-c0bc865ff354" (UID: "69df31f0-004d-41d9-8c4c-c0bc865ff354"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.540539 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "69df31f0-004d-41d9-8c4c-c0bc865ff354" (UID: "69df31f0-004d-41d9-8c4c-c0bc865ff354"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.550493 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.550538 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.550554 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl6kl\" (UniqueName: \"kubernetes.io/projected/69df31f0-004d-41d9-8c4c-c0bc865ff354-kube-api-access-hl6kl\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.550565 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.556005 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "69df31f0-004d-41d9-8c4c-c0bc865ff354" (UID: "69df31f0-004d-41d9-8c4c-c0bc865ff354"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.556820 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "69df31f0-004d-41d9-8c4c-c0bc865ff354" (UID: "69df31f0-004d-41d9-8c4c-c0bc865ff354"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.654010 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:18 crc kubenswrapper[4740]: I0105 14:12:18.654051 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69df31f0-004d-41d9-8c4c-c0bc865ff354-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:19 crc kubenswrapper[4740]: I0105 14:12:19.287963 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-jqjqm" Jan 05 14:12:19 crc kubenswrapper[4740]: I0105 14:12:19.288200 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b00767-28bf-4158-9f4e-bb3c5dafd7d5","Type":"ContainerStarted","Data":"81006f9e448378ee3c82b701651a413f1f481b235662dcd15a30d0ef724a6fef"} Jan 05 14:12:19 crc kubenswrapper[4740]: I0105 14:12:19.315234 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-jqjqm"] Jan 05 14:12:19 crc kubenswrapper[4740]: I0105 14:12:19.325678 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-jqjqm"] Jan 05 14:12:20 crc kubenswrapper[4740]: I0105 14:12:20.545400 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 05 14:12:20 crc kubenswrapper[4740]: I0105 14:12:20.576020 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 05 14:12:20 crc kubenswrapper[4740]: I0105 14:12:20.983508 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69df31f0-004d-41d9-8c4c-c0bc865ff354" path="/var/lib/kubelet/pods/69df31f0-004d-41d9-8c4c-c0bc865ff354/volumes" Jan 05 14:12:21 crc kubenswrapper[4740]: I0105 14:12:21.326868 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b00767-28bf-4158-9f4e-bb3c5dafd7d5","Type":"ContainerStarted","Data":"1da9c538825a5a7615d10f0d492f1312807aac4394eabab3b468d19fc7303f25"} Jan 05 14:12:21 crc kubenswrapper[4740]: I0105 14:12:21.327839 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 05 14:12:21 crc kubenswrapper[4740]: I0105 14:12:21.350875 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.44505051 podStartE2EDuration="7.350858061s" podCreationTimestamp="2026-01-05 14:12:14 +0000 UTC" firstStartedPulling="2026-01-05 14:12:15.194889476 +0000 UTC m=+1384.501798065" lastFinishedPulling="2026-01-05 14:12:20.100697047 +0000 UTC m=+1389.407605616" observedRunningTime="2026-01-05 14:12:21.347948843 +0000 UTC m=+1390.654857422" watchObservedRunningTime="2026-01-05 14:12:21.350858061 +0000 UTC m=+1390.657766640" Jan 05 14:12:21 crc kubenswrapper[4740]: I0105 14:12:21.380857 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 05 14:12:21 crc kubenswrapper[4740]: I0105 14:12:21.807959 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 05 14:12:21 crc kubenswrapper[4740]: I0105 14:12:21.808743 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 05 14:12:21 crc kubenswrapper[4740]: I0105 14:12:21.900414 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 05 14:12:21 crc kubenswrapper[4740]: I0105 14:12:21.900793 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 05 14:12:22 crc kubenswrapper[4740]: I0105 14:12:22.823257 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="03e6abd6-0882-4930-90c6-d2ba9ded2d49" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.7:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting 
headers)" Jan 05 14:12:22 crc kubenswrapper[4740]: I0105 14:12:22.823417 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="03e6abd6-0882-4930-90c6-d2ba9ded2d49" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.7:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 14:12:22 crc kubenswrapper[4740]: I0105 14:12:22.922253 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bd6f1475-5188-4831-91b3-e6741b308a2e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.8:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 14:12:22 crc kubenswrapper[4740]: I0105 14:12:22.922271 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bd6f1475-5188-4831-91b3-e6741b308a2e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.8:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 14:12:26 crc kubenswrapper[4740]: I0105 14:12:26.404463 4740 generic.go:334] "Generic (PLEG): container finished" podID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerID="a7465aac3718aa934ac766c826ddef7fd8ef6f71f0022b16bf7cbd1d8a82ae1e" exitCode=137 Jan 05 14:12:26 crc kubenswrapper[4740]: I0105 14:12:26.405980 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b9ea403a-5716-4424-bfeb-61537e7c3969","Type":"ContainerDied","Data":"a7465aac3718aa934ac766c826ddef7fd8ef6f71f0022b16bf7cbd1d8a82ae1e"} Jan 05 14:12:26 crc kubenswrapper[4740]: I0105 14:12:26.544418 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Jan 05 14:12:26 crc kubenswrapper[4740]: I0105 14:12:26.682936 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-scripts\") pod \"b9ea403a-5716-4424-bfeb-61537e7c3969\" (UID: \"b9ea403a-5716-4424-bfeb-61537e7c3969\") " Jan 05 14:12:26 crc kubenswrapper[4740]: I0105 14:12:26.683169 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srrqv\" (UniqueName: \"kubernetes.io/projected/b9ea403a-5716-4424-bfeb-61537e7c3969-kube-api-access-srrqv\") pod \"b9ea403a-5716-4424-bfeb-61537e7c3969\" (UID: \"b9ea403a-5716-4424-bfeb-61537e7c3969\") " Jan 05 14:12:26 crc kubenswrapper[4740]: I0105 14:12:26.683951 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-combined-ca-bundle\") pod \"b9ea403a-5716-4424-bfeb-61537e7c3969\" (UID: \"b9ea403a-5716-4424-bfeb-61537e7c3969\") " Jan 05 14:12:26 crc kubenswrapper[4740]: I0105 14:12:26.684171 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-config-data\") pod \"b9ea403a-5716-4424-bfeb-61537e7c3969\" (UID: \"b9ea403a-5716-4424-bfeb-61537e7c3969\") " Jan 05 14:12:26 crc kubenswrapper[4740]: I0105 14:12:26.688769 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9ea403a-5716-4424-bfeb-61537e7c3969-kube-api-access-srrqv" (OuterVolumeSpecName: "kube-api-access-srrqv") pod "b9ea403a-5716-4424-bfeb-61537e7c3969" (UID: "b9ea403a-5716-4424-bfeb-61537e7c3969"). InnerVolumeSpecName "kube-api-access-srrqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:12:26 crc kubenswrapper[4740]: I0105 14:12:26.689742 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-scripts" (OuterVolumeSpecName: "scripts") pod "b9ea403a-5716-4424-bfeb-61537e7c3969" (UID: "b9ea403a-5716-4424-bfeb-61537e7c3969"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:26 crc kubenswrapper[4740]: I0105 14:12:26.787439 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:26 crc kubenswrapper[4740]: I0105 14:12:26.787467 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srrqv\" (UniqueName: \"kubernetes.io/projected/b9ea403a-5716-4424-bfeb-61537e7c3969-kube-api-access-srrqv\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:26 crc kubenswrapper[4740]: I0105 14:12:26.819694 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9ea403a-5716-4424-bfeb-61537e7c3969" (UID: "b9ea403a-5716-4424-bfeb-61537e7c3969"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:26 crc kubenswrapper[4740]: I0105 14:12:26.841187 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-config-data" (OuterVolumeSpecName: "config-data") pod "b9ea403a-5716-4424-bfeb-61537e7c3969" (UID: "b9ea403a-5716-4424-bfeb-61537e7c3969"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:26 crc kubenswrapper[4740]: I0105 14:12:26.889447 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:26 crc kubenswrapper[4740]: I0105 14:12:26.889679 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9ea403a-5716-4424-bfeb-61537e7c3969-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.425138 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b9ea403a-5716-4424-bfeb-61537e7c3969","Type":"ContainerDied","Data":"70cb164cfca68fd94866ff3ebd2ff0aa8b5c92c56a44ab6eb1ac019dcd176a42"} Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.425194 4740 scope.go:117] "RemoveContainer" containerID="a7465aac3718aa934ac766c826ddef7fd8ef6f71f0022b16bf7cbd1d8a82ae1e" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.425264 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.539747 4740 scope.go:117] "RemoveContainer" containerID="cce08d473f01d1930853d089c569f7b9039ff8e9f5e57543e44aa5ead335d46a" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.585494 4740 scope.go:117] "RemoveContainer" containerID="9ae4cce6bba6e901e04c18948752dde5a7167997717b7292e1b23213a9580aae" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.594943 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.604785 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.615176 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Jan 05 14:12:27 crc kubenswrapper[4740]: E0105 14:12:27.615839 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerName="aodh-api" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.615908 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerName="aodh-api" Jan 05 14:12:27 crc kubenswrapper[4740]: E0105 14:12:27.615978 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerName="aodh-notifier" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.616039 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerName="aodh-notifier" Jan 05 14:12:27 crc kubenswrapper[4740]: E0105 14:12:27.616114 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerName="aodh-listener" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.616176 4740 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerName="aodh-listener" Jan 05 14:12:27 crc kubenswrapper[4740]: E0105 14:12:27.616237 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69df31f0-004d-41d9-8c4c-c0bc865ff354" containerName="dnsmasq-dns" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.616285 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="69df31f0-004d-41d9-8c4c-c0bc865ff354" containerName="dnsmasq-dns" Jan 05 14:12:27 crc kubenswrapper[4740]: E0105 14:12:27.616352 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerName="aodh-evaluator" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.616402 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerName="aodh-evaluator" Jan 05 14:12:27 crc kubenswrapper[4740]: E0105 14:12:27.616500 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69df31f0-004d-41d9-8c4c-c0bc865ff354" containerName="init" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.616550 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="69df31f0-004d-41d9-8c4c-c0bc865ff354" containerName="init" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.616679 4740 scope.go:117] "RemoveContainer" containerID="bb39eed9a7b3583dfe9836d671d14f898c313238abffbc81ada6878e3ed529df" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.616997 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerName="aodh-notifier" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.617305 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerName="aodh-listener" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.617416 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="69df31f0-004d-41d9-8c4c-c0bc865ff354" containerName="dnsmasq-dns" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.617476 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerName="aodh-api" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.617545 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ea403a-5716-4424-bfeb-61537e7c3969" containerName="aodh-evaluator" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.620404 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.625780 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.625932 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.625976 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.626034 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.626258 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.626430 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-f8s9l" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.709684 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-public-tls-certs\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.710240 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-combined-ca-bundle\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.710393 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64445\" (UniqueName: \"kubernetes.io/projected/96b6a192-1efc-47ce-9c5b-26539409d69c-kube-api-access-64445\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.710510 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-config-data\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.710612 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-internal-tls-certs\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.710704 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-scripts\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.812632 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-combined-ca-bundle\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" 
Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.812746 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64445\" (UniqueName: \"kubernetes.io/projected/96b6a192-1efc-47ce-9c5b-26539409d69c-kube-api-access-64445\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.812824 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-config-data\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.812858 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-internal-tls-certs\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.812906 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-scripts\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.812974 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-public-tls-certs\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.821653 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-config-data\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.821872 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-scripts\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.821886 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-combined-ca-bundle\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.822263 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-public-tls-certs\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.822548 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-internal-tls-certs\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.845010 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64445\" 
(UniqueName: \"kubernetes.io/projected/96b6a192-1efc-47ce-9c5b-26539409d69c-kube-api-access-64445\") pod \"aodh-0\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " pod="openstack/aodh-0" Jan 05 14:12:27 crc kubenswrapper[4740]: I0105 14:12:27.956397 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 05 14:12:28 crc kubenswrapper[4740]: I0105 14:12:28.456294 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 05 14:12:28 crc kubenswrapper[4740]: W0105 14:12:28.466565 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96b6a192_1efc_47ce_9c5b_26539409d69c.slice/crio-390be8b058510812af0d6eb9aa8fcc8cc906da6e594756eced3a474e31b6fac6 WatchSource:0}: Error finding container 390be8b058510812af0d6eb9aa8fcc8cc906da6e594756eced3a474e31b6fac6: Status 404 returned error can't find the container with id 390be8b058510812af0d6eb9aa8fcc8cc906da6e594756eced3a474e31b6fac6 Jan 05 14:12:28 crc kubenswrapper[4740]: I0105 14:12:28.983676 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9ea403a-5716-4424-bfeb-61537e7c3969" path="/var/lib/kubelet/pods/b9ea403a-5716-4424-bfeb-61537e7c3969/volumes" Jan 05 14:12:29 crc kubenswrapper[4740]: I0105 14:12:29.501337 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"96b6a192-1efc-47ce-9c5b-26539409d69c","Type":"ContainerStarted","Data":"390be8b058510812af0d6eb9aa8fcc8cc906da6e594756eced3a474e31b6fac6"} Jan 05 14:12:30 crc kubenswrapper[4740]: I0105 14:12:30.516083 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"96b6a192-1efc-47ce-9c5b-26539409d69c","Type":"ContainerStarted","Data":"cd8c32c9e8e2b0ea3fd487bd88c97bad3c4291d7986b61f79148c332b9ed1663"} Jan 05 14:12:31 crc kubenswrapper[4740]: I0105 14:12:31.539754 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"96b6a192-1efc-47ce-9c5b-26539409d69c","Type":"ContainerStarted","Data":"278c2034c407eb00ceb629ad204f5ba2557b2d84ec7b3deca21971b273afa162"} Jan 05 14:12:31 crc kubenswrapper[4740]: I0105 14:12:31.847941 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 05 14:12:31 crc kubenswrapper[4740]: I0105 14:12:31.848637 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 05 14:12:31 crc kubenswrapper[4740]: I0105 14:12:31.849327 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 05 14:12:31 crc kubenswrapper[4740]: I0105 14:12:31.850544 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 05 14:12:31 crc kubenswrapper[4740]: I0105 14:12:31.857581 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 05 14:12:31 crc kubenswrapper[4740]: I0105 14:12:31.860530 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 05 14:12:31 crc kubenswrapper[4740]: I0105 14:12:31.915226 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 05 14:12:31 crc kubenswrapper[4740]: I0105 14:12:31.915344 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 05 14:12:31 crc kubenswrapper[4740]: I0105 14:12:31.925800 4740 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 05 14:12:31 crc kubenswrapper[4740]: I0105 14:12:31.930420 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 05 14:12:33 crc kubenswrapper[4740]: I0105 14:12:33.573913 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"96b6a192-1efc-47ce-9c5b-26539409d69c","Type":"ContainerStarted","Data":"87dda8af178e5c6e53129ed4da8013810d0bbed083a8d069aaca81ddd115d5b6"} Jan 05 14:12:36 crc kubenswrapper[4740]: I0105 14:12:36.610534 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"96b6a192-1efc-47ce-9c5b-26539409d69c","Type":"ContainerStarted","Data":"236541ca6123d7d5583d91aeb48255b7f86cc79c0fd2ca95738893b0d0a7d14f"} Jan 05 14:12:44 crc kubenswrapper[4740]: I0105 14:12:44.656679 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 05 14:12:44 crc kubenswrapper[4740]: I0105 14:12:44.691936 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=10.619488064 podStartE2EDuration="17.691909397s" podCreationTimestamp="2026-01-05 14:12:27 +0000 UTC" firstStartedPulling="2026-01-05 14:12:28.470559885 +0000 UTC m=+1397.777468464" lastFinishedPulling="2026-01-05 14:12:35.542981208 +0000 UTC m=+1404.849889797" observedRunningTime="2026-01-05 14:12:36.649895238 +0000 UTC m=+1405.956803817" watchObservedRunningTime="2026-01-05 14:12:44.691909397 +0000 UTC m=+1413.998817976" Jan 05 14:12:48 crc kubenswrapper[4740]: I0105 14:12:48.307403 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 05 14:12:48 crc kubenswrapper[4740]: I0105 14:12:48.308193 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="81930e93-1484-4b87-9aeb-f05bd0de40b5" containerName="kube-state-metrics" containerID="cri-o://4c5137acd640422db442140325167da8d9c8a551020f5a81f47596086ad81bd4" gracePeriod=30 Jan 05 14:12:48 crc kubenswrapper[4740]: I0105 14:12:48.373610 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 05 14:12:48 crc kubenswrapper[4740]: I0105 14:12:48.373839 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7" containerName="mysqld-exporter" containerID="cri-o://7619f4bb3740d69d69acd1bc1b4226a393608569026b5a98ebb69115ee39acc6" gracePeriod=30 Jan 05 14:12:48 crc kubenswrapper[4740]: I0105 14:12:48.778508 4740 generic.go:334] "Generic (PLEG): container finished" podID="19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7" containerID="7619f4bb3740d69d69acd1bc1b4226a393608569026b5a98ebb69115ee39acc6" exitCode=2 Jan 05 14:12:48 crc kubenswrapper[4740]: I0105 14:12:48.778772 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7","Type":"ContainerDied","Data":"7619f4bb3740d69d69acd1bc1b4226a393608569026b5a98ebb69115ee39acc6"} Jan 05 14:12:48 crc kubenswrapper[4740]: I0105 14:12:48.780268 4740 generic.go:334] "Generic (PLEG): container finished" podID="81930e93-1484-4b87-9aeb-f05bd0de40b5" containerID="4c5137acd640422db442140325167da8d9c8a551020f5a81f47596086ad81bd4" exitCode=2 Jan 05 14:12:48 crc kubenswrapper[4740]: I0105 14:12:48.780296 4740 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"81930e93-1484-4b87-9aeb-f05bd0de40b5","Type":"ContainerDied","Data":"4c5137acd640422db442140325167da8d9c8a551020f5a81f47596086ad81bd4"} Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.097255 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.108580 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.123116 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smc9r\" (UniqueName: \"kubernetes.io/projected/81930e93-1484-4b87-9aeb-f05bd0de40b5-kube-api-access-smc9r\") pod \"81930e93-1484-4b87-9aeb-f05bd0de40b5\" (UID: \"81930e93-1484-4b87-9aeb-f05bd0de40b5\") " Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.130959 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81930e93-1484-4b87-9aeb-f05bd0de40b5-kube-api-access-smc9r" (OuterVolumeSpecName: "kube-api-access-smc9r") pod "81930e93-1484-4b87-9aeb-f05bd0de40b5" (UID: "81930e93-1484-4b87-9aeb-f05bd0de40b5"). InnerVolumeSpecName "kube-api-access-smc9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.234673 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-combined-ca-bundle\") pod \"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7\" (UID: \"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7\") " Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.234736 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz2xg\" (UniqueName: \"kubernetes.io/projected/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-kube-api-access-tz2xg\") pod \"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7\" (UID: \"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7\") " Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.234883 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-config-data\") pod \"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7\" (UID: \"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7\") " Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.235683 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smc9r\" (UniqueName: \"kubernetes.io/projected/81930e93-1484-4b87-9aeb-f05bd0de40b5-kube-api-access-smc9r\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.240755 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-kube-api-access-tz2xg" (OuterVolumeSpecName: "kube-api-access-tz2xg") pod "19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7" (UID: "19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7"). InnerVolumeSpecName "kube-api-access-tz2xg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.280427 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7" (UID: "19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.305761 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-config-data" (OuterVolumeSpecName: "config-data") pod "19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7" (UID: "19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.340727 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.340757 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.340770 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz2xg\" (UniqueName: \"kubernetes.io/projected/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7-kube-api-access-tz2xg\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.795306 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7","Type":"ContainerDied","Data":"426ab1585700c30f90e2d09ac12f667bc3fd58186ccf6fbd196bd7462616cc1e"} Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.795388 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.795628 4740 scope.go:117] "RemoveContainer" containerID="7619f4bb3740d69d69acd1bc1b4226a393608569026b5a98ebb69115ee39acc6" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.797081 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"81930e93-1484-4b87-9aeb-f05bd0de40b5","Type":"ContainerDied","Data":"5748fa50e42eb88beb38468614598425fdc88467bf853e4b3c4a433d12a2795d"} Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.797164 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.854440 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.874361 4740 scope.go:117] "RemoveContainer" containerID="4c5137acd640422db442140325167da8d9c8a551020f5a81f47596086ad81bd4" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.876558 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.916208 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 05 14:12:49 crc kubenswrapper[4740]: E0105 14:12:49.916783 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7" containerName="mysqld-exporter" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.916803 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7" containerName="mysqld-exporter" Jan 05 14:12:49 crc kubenswrapper[4740]: E0105 14:12:49.916822 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81930e93-1484-4b87-9aeb-f05bd0de40b5" containerName="kube-state-metrics" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.916830 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="81930e93-1484-4b87-9aeb-f05bd0de40b5" containerName="kube-state-metrics" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.917120 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7" containerName="mysqld-exporter" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.917149 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="81930e93-1484-4b87-9aeb-f05bd0de40b5" containerName="kube-state-metrics" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.917971 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.920136 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.922523 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.932297 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.950497 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.964244 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq772\" (UniqueName: \"kubernetes.io/projected/6d1b6a83-0692-4de9-8d5f-56f4371b9d22-kube-api-access-tq772\") pod \"kube-state-metrics-0\" (UID: \"6d1b6a83-0692-4de9-8d5f-56f4371b9d22\") " pod="openstack/kube-state-metrics-0" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.964285 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d1b6a83-0692-4de9-8d5f-56f4371b9d22-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6d1b6a83-0692-4de9-8d5f-56f4371b9d22\") " pod="openstack/kube-state-metrics-0" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.964337 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d1b6a83-0692-4de9-8d5f-56f4371b9d22-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6d1b6a83-0692-4de9-8d5f-56f4371b9d22\") " pod="openstack/kube-state-metrics-0" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.964411 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6d1b6a83-0692-4de9-8d5f-56f4371b9d22-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6d1b6a83-0692-4de9-8d5f-56f4371b9d22\") " pod="openstack/kube-state-metrics-0" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.964557 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.977532 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.980435 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.983041 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Jan 05 14:12:49 crc kubenswrapper[4740]: I0105 14:12:49.984240 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.001287 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.066605 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6d1b6a83-0692-4de9-8d5f-56f4371b9d22-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6d1b6a83-0692-4de9-8d5f-56f4371b9d22\") " pod="openstack/kube-state-metrics-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.066707 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b6027cc-1afc-468f-9a91-6c1f3f844ba2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"4b6027cc-1afc-468f-9a91-6c1f3f844ba2\") " pod="openstack/mysqld-exporter-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.066789 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b6027cc-1afc-468f-9a91-6c1f3f844ba2-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"4b6027cc-1afc-468f-9a91-6c1f3f844ba2\") " pod="openstack/mysqld-exporter-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.066825 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b6027cc-1afc-468f-9a91-6c1f3f844ba2-config-data\") pod \"mysqld-exporter-0\" (UID: \"4b6027cc-1afc-468f-9a91-6c1f3f844ba2\") " pod="openstack/mysqld-exporter-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.067134 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq772\" (UniqueName: \"kubernetes.io/projected/6d1b6a83-0692-4de9-8d5f-56f4371b9d22-kube-api-access-tq772\") pod \"kube-state-metrics-0\" (UID: \"6d1b6a83-0692-4de9-8d5f-56f4371b9d22\") " pod="openstack/kube-state-metrics-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.067197 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d1b6a83-0692-4de9-8d5f-56f4371b9d22-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6d1b6a83-0692-4de9-8d5f-56f4371b9d22\") " pod="openstack/kube-state-metrics-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.067281 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhdkw\" (UniqueName: \"kubernetes.io/projected/4b6027cc-1afc-468f-9a91-6c1f3f844ba2-kube-api-access-jhdkw\") pod \"mysqld-exporter-0\" (UID: \"4b6027cc-1afc-468f-9a91-6c1f3f844ba2\") " pod="openstack/mysqld-exporter-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.067424 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6d1b6a83-0692-4de9-8d5f-56f4371b9d22-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6d1b6a83-0692-4de9-8d5f-56f4371b9d22\") " pod="openstack/kube-state-metrics-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.071441 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d1b6a83-0692-4de9-8d5f-56f4371b9d22-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6d1b6a83-0692-4de9-8d5f-56f4371b9d22\") " pod="openstack/kube-state-metrics-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.071681 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6d1b6a83-0692-4de9-8d5f-56f4371b9d22-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6d1b6a83-0692-4de9-8d5f-56f4371b9d22\") " pod="openstack/kube-state-metrics-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.073390 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d1b6a83-0692-4de9-8d5f-56f4371b9d22-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6d1b6a83-0692-4de9-8d5f-56f4371b9d22\") " pod="openstack/kube-state-metrics-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.087707 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq772\" (UniqueName: \"kubernetes.io/projected/6d1b6a83-0692-4de9-8d5f-56f4371b9d22-kube-api-access-tq772\") pod \"kube-state-metrics-0\" (UID: \"6d1b6a83-0692-4de9-8d5f-56f4371b9d22\") " pod="openstack/kube-state-metrics-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.170287 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b6027cc-1afc-468f-9a91-6c1f3f844ba2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"4b6027cc-1afc-468f-9a91-6c1f3f844ba2\") " pod="openstack/mysqld-exporter-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.170389 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b6027cc-1afc-468f-9a91-6c1f3f844ba2-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"4b6027cc-1afc-468f-9a91-6c1f3f844ba2\") " pod="openstack/mysqld-exporter-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.170417 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b6027cc-1afc-468f-9a91-6c1f3f844ba2-config-data\") pod \"mysqld-exporter-0\" (UID: \"4b6027cc-1afc-468f-9a91-6c1f3f844ba2\") " pod="openstack/mysqld-exporter-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.170462 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhdkw\" (UniqueName: \"kubernetes.io/projected/4b6027cc-1afc-468f-9a91-6c1f3f844ba2-kube-api-access-jhdkw\") pod \"mysqld-exporter-0\" (UID: \"4b6027cc-1afc-468f-9a91-6c1f3f844ba2\") " pod="openstack/mysqld-exporter-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.174897 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b6027cc-1afc-468f-9a91-6c1f3f844ba2-config-data\") pod \"mysqld-exporter-0\" (UID: \"4b6027cc-1afc-468f-9a91-6c1f3f844ba2\") " 
pod="openstack/mysqld-exporter-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.175405 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b6027cc-1afc-468f-9a91-6c1f3f844ba2-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"4b6027cc-1afc-468f-9a91-6c1f3f844ba2\") " pod="openstack/mysqld-exporter-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.175745 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b6027cc-1afc-468f-9a91-6c1f3f844ba2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"4b6027cc-1afc-468f-9a91-6c1f3f844ba2\") " pod="openstack/mysqld-exporter-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.190340 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhdkw\" (UniqueName: \"kubernetes.io/projected/4b6027cc-1afc-468f-9a91-6c1f3f844ba2-kube-api-access-jhdkw\") pod \"mysqld-exporter-0\" (UID: \"4b6027cc-1afc-468f-9a91-6c1f3f844ba2\") " pod="openstack/mysqld-exporter-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.236957 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.305779 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.606039 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.606933 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerName="ceilometer-central-agent" containerID="cri-o://e1e9b1fce0283d30a03b8f07252671b21169cb2d8c83348559c49139fb8c4386" gracePeriod=30 Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.607675 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerName="proxy-httpd" containerID="cri-o://1da9c538825a5a7615d10f0d492f1312807aac4394eabab3b468d19fc7303f25" gracePeriod=30 Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.607790 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerName="ceilometer-notification-agent" containerID="cri-o://681acffd14d38ee97ffaa04619d8ca2e02a1a221f2a07acb2863bbe1285ad428" gracePeriod=30 Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.607843 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerName="sg-core" containerID="cri-o://81006f9e448378ee3c82b701651a413f1f481b235662dcd15a30d0ef724a6fef" gracePeriod=30 Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.770485 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 05 14:12:50 crc kubenswrapper[4740]: W0105 14:12:50.774868 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d1b6a83_0692_4de9_8d5f_56f4371b9d22.slice/crio-494f6eaa6f9813f934d7a00a1f01527ee3022b54bfb0242ac1a63b76c276fbc7 WatchSource:0}: Error finding container 
494f6eaa6f9813f934d7a00a1f01527ee3022b54bfb0242ac1a63b76c276fbc7: Status 404 returned error can't find the container with id 494f6eaa6f9813f934d7a00a1f01527ee3022b54bfb0242ac1a63b76c276fbc7 Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.780546 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.809912 4740 generic.go:334] "Generic (PLEG): container finished" podID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerID="81006f9e448378ee3c82b701651a413f1f481b235662dcd15a30d0ef724a6fef" exitCode=2 Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.810215 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b00767-28bf-4158-9f4e-bb3c5dafd7d5","Type":"ContainerDied","Data":"81006f9e448378ee3c82b701651a413f1f481b235662dcd15a30d0ef724a6fef"} Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.813745 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6d1b6a83-0692-4de9-8d5f-56f4371b9d22","Type":"ContainerStarted","Data":"494f6eaa6f9813f934d7a00a1f01527ee3022b54bfb0242ac1a63b76c276fbc7"} Jan 05 14:12:50 crc kubenswrapper[4740]: I0105 14:12:50.891410 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Jan 05 14:12:50 crc kubenswrapper[4740]: W0105 14:12:50.893450 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b6027cc_1afc_468f_9a91_6c1f3f844ba2.slice/crio-8919daf2f99c57d2bb5e23a3955346153288b4bb7c6ab6c0246ee257a290d699 WatchSource:0}: Error finding container 8919daf2f99c57d2bb5e23a3955346153288b4bb7c6ab6c0246ee257a290d699: Status 404 returned error can't find the container with id 8919daf2f99c57d2bb5e23a3955346153288b4bb7c6ab6c0246ee257a290d699 Jan 05 14:12:51 crc kubenswrapper[4740]: I0105 14:12:51.012292 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7" path="/var/lib/kubelet/pods/19bc5ab9-cfc5-4d1b-8082-ad6c2edf59f7/volumes" Jan 05 14:12:51 crc kubenswrapper[4740]: I0105 14:12:51.012992 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81930e93-1484-4b87-9aeb-f05bd0de40b5" path="/var/lib/kubelet/pods/81930e93-1484-4b87-9aeb-f05bd0de40b5/volumes" Jan 05 14:12:51 crc kubenswrapper[4740]: I0105 14:12:51.827309 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"4b6027cc-1afc-468f-9a91-6c1f3f844ba2","Type":"ContainerStarted","Data":"8919daf2f99c57d2bb5e23a3955346153288b4bb7c6ab6c0246ee257a290d699"} Jan 05 14:12:51 crc kubenswrapper[4740]: I0105 14:12:51.834284 4740 generic.go:334] "Generic (PLEG): container finished" podID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerID="1da9c538825a5a7615d10f0d492f1312807aac4394eabab3b468d19fc7303f25" exitCode=0 Jan 05 14:12:51 crc kubenswrapper[4740]: I0105 14:12:51.834330 4740 generic.go:334] "Generic (PLEG): container finished" podID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerID="e1e9b1fce0283d30a03b8f07252671b21169cb2d8c83348559c49139fb8c4386" exitCode=0 Jan 05 14:12:51 crc kubenswrapper[4740]: I0105 14:12:51.834370 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b00767-28bf-4158-9f4e-bb3c5dafd7d5","Type":"ContainerDied","Data":"1da9c538825a5a7615d10f0d492f1312807aac4394eabab3b468d19fc7303f25"} Jan 05 14:12:51 crc 
kubenswrapper[4740]: I0105 14:12:51.834455 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b00767-28bf-4158-9f4e-bb3c5dafd7d5","Type":"ContainerDied","Data":"e1e9b1fce0283d30a03b8f07252671b21169cb2d8c83348559c49139fb8c4386"} Jan 05 14:12:52 crc kubenswrapper[4740]: I0105 14:12:52.846917 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"4b6027cc-1afc-468f-9a91-6c1f3f844ba2","Type":"ContainerStarted","Data":"55cc93962a80810ba66646f663e34257f3b41775df847f5d5aeed1452cdd5ad7"} Jan 05 14:12:52 crc kubenswrapper[4740]: I0105 14:12:52.849629 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6d1b6a83-0692-4de9-8d5f-56f4371b9d22","Type":"ContainerStarted","Data":"3f5843d4d6b80b68226f66d2d228ee2f70402b1f746a8c5f1084b5419675e0ff"} Jan 05 14:12:52 crc kubenswrapper[4740]: I0105 14:12:52.849765 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 05 14:12:52 crc kubenswrapper[4740]: I0105 14:12:52.867767 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=3.086254036 podStartE2EDuration="3.867749695s" podCreationTimestamp="2026-01-05 14:12:49 +0000 UTC" firstStartedPulling="2026-01-05 14:12:50.895905242 +0000 UTC m=+1420.202813831" lastFinishedPulling="2026-01-05 14:12:51.677400911 +0000 UTC m=+1420.984309490" observedRunningTime="2026-01-05 14:12:52.867395586 +0000 UTC m=+1422.174304165" watchObservedRunningTime="2026-01-05 14:12:52.867749695 +0000 UTC m=+1422.174658284" Jan 05 14:12:52 crc kubenswrapper[4740]: I0105 14:12:52.889101 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.154024901 podStartE2EDuration="3.889062499s" podCreationTimestamp="2026-01-05 14:12:49 +0000 UTC" firstStartedPulling="2026-01-05 14:12:50.780240142 +0000 UTC m=+1420.087148731" lastFinishedPulling="2026-01-05 14:12:51.51527773 +0000 UTC m=+1420.822186329" observedRunningTime="2026-01-05 14:12:52.888614476 +0000 UTC m=+1422.195523075" watchObservedRunningTime="2026-01-05 14:12:52.889062499 +0000 UTC m=+1422.195971078" Jan 05 14:12:55 crc kubenswrapper[4740]: I0105 14:12:55.895806 4740 generic.go:334] "Generic (PLEG): container finished" podID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerID="681acffd14d38ee97ffaa04619d8ca2e02a1a221f2a07acb2863bbe1285ad428" exitCode=0 Jan 05 14:12:55 crc kubenswrapper[4740]: I0105 14:12:55.895886 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b00767-28bf-4158-9f4e-bb3c5dafd7d5","Type":"ContainerDied","Data":"681acffd14d38ee97ffaa04619d8ca2e02a1a221f2a07acb2863bbe1285ad428"} Jan 05 14:12:55 crc kubenswrapper[4740]: I0105 14:12:55.896648 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71b00767-28bf-4158-9f4e-bb3c5dafd7d5","Type":"ContainerDied","Data":"f4f84c5c0207eb0e7b773789bd867072f97d162d913663f89ab61a1d967efd08"} Jan 05 14:12:55 crc kubenswrapper[4740]: I0105 14:12:55.896675 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4f84c5c0207eb0e7b773789bd867072f97d162d913663f89ab61a1d967efd08" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.056828 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.243212 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-run-httpd\") pod \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.243680 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "71b00767-28bf-4158-9f4e-bb3c5dafd7d5" (UID: "71b00767-28bf-4158-9f4e-bb3c5dafd7d5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.243906 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgnl9\" (UniqueName: \"kubernetes.io/projected/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-kube-api-access-mgnl9\") pod \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.244633 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-log-httpd\") pod \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.244700 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-scripts\") pod \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.244784 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-combined-ca-bundle\") pod \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.244959 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-sg-core-conf-yaml\") pod \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.244992 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-config-data\") pod \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\" (UID: \"71b00767-28bf-4158-9f4e-bb3c5dafd7d5\") " Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.245466 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "71b00767-28bf-4158-9f4e-bb3c5dafd7d5" (UID: "71b00767-28bf-4158-9f4e-bb3c5dafd7d5"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.246476 4740 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.246501 4740 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.251231 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-kube-api-access-mgnl9" (OuterVolumeSpecName: "kube-api-access-mgnl9") pod "71b00767-28bf-4158-9f4e-bb3c5dafd7d5" (UID: "71b00767-28bf-4158-9f4e-bb3c5dafd7d5"). InnerVolumeSpecName "kube-api-access-mgnl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.253821 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-scripts" (OuterVolumeSpecName: "scripts") pod "71b00767-28bf-4158-9f4e-bb3c5dafd7d5" (UID: "71b00767-28bf-4158-9f4e-bb3c5dafd7d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.278042 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "71b00767-28bf-4158-9f4e-bb3c5dafd7d5" (UID: "71b00767-28bf-4158-9f4e-bb3c5dafd7d5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.348991 4740 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.349025 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgnl9\" (UniqueName: \"kubernetes.io/projected/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-kube-api-access-mgnl9\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.349037 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.361278 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71b00767-28bf-4158-9f4e-bb3c5dafd7d5" (UID: "71b00767-28bf-4158-9f4e-bb3c5dafd7d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.448679 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-config-data" (OuterVolumeSpecName: "config-data") pod "71b00767-28bf-4158-9f4e-bb3c5dafd7d5" (UID: "71b00767-28bf-4158-9f4e-bb3c5dafd7d5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.458980 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.459027 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71b00767-28bf-4158-9f4e-bb3c5dafd7d5-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.907109 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.980954 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.984199 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.995437 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:12:56 crc kubenswrapper[4740]: E0105 14:12:56.996098 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerName="proxy-httpd" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.996122 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerName="proxy-httpd" Jan 05 14:12:56 crc kubenswrapper[4740]: E0105 14:12:56.996142 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerName="sg-core" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.996151 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerName="sg-core" Jan 05 14:12:56 crc kubenswrapper[4740]: E0105 14:12:56.996199 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerName="ceilometer-notification-agent" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.996208 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerName="ceilometer-notification-agent" Jan 05 14:12:56 crc kubenswrapper[4740]: E0105 14:12:56.996241 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerName="ceilometer-central-agent" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.996250 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerName="ceilometer-central-agent" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.996548 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerName="sg-core" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.996580 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerName="proxy-httpd" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.996610 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerName="ceilometer-central-agent" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.996626 4740 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" containerName="ceilometer-notification-agent" Jan 05 14:12:56 crc kubenswrapper[4740]: I0105 14:12:56.999457 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.001564 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.001838 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.002026 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.018820 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.175842 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9693bc-68c2-4ba7-bd23-95303bf357d4-log-httpd\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.176520 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65q9c\" (UniqueName: \"kubernetes.io/projected/dc9693bc-68c2-4ba7-bd23-95303bf357d4-kube-api-access-65q9c\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.176713 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.176944 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-scripts\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.177270 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9693bc-68c2-4ba7-bd23-95303bf357d4-run-httpd\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.177378 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-config-data\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.177446 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc 
kubenswrapper[4740]: I0105 14:12:57.177713 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.280961 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-config-data\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.281090 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.281223 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.281286 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9693bc-68c2-4ba7-bd23-95303bf357d4-log-httpd\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.281817 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65q9c\" (UniqueName: \"kubernetes.io/projected/dc9693bc-68c2-4ba7-bd23-95303bf357d4-kube-api-access-65q9c\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.281908 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.281962 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-scripts\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.282051 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9693bc-68c2-4ba7-bd23-95303bf357d4-log-httpd\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.282205 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9693bc-68c2-4ba7-bd23-95303bf357d4-run-httpd\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 
crc kubenswrapper[4740]: I0105 14:12:57.282782 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9693bc-68c2-4ba7-bd23-95303bf357d4-run-httpd\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.287033 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.289913 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-config-data\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.293594 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.300595 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.303270 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-scripts\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.314344 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65q9c\" (UniqueName: \"kubernetes.io/projected/dc9693bc-68c2-4ba7-bd23-95303bf357d4-kube-api-access-65q9c\") pod \"ceilometer-0\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.323916 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.818834 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.887749 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-979vn"] Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.899506 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-979vn"] Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.941969 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9693bc-68c2-4ba7-bd23-95303bf357d4","Type":"ContainerStarted","Data":"236285c7654e124741e33763eb8873aea4a3f4abf2b4dbff4cf9e286ef51fca2"} Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.943184 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-snmgh"] Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.944905 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-snmgh" Jan 05 14:12:57 crc kubenswrapper[4740]: I0105 14:12:57.955117 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-snmgh"] Jan 05 14:12:58 crc kubenswrapper[4740]: I0105 14:12:58.104768 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-896mw\" (UniqueName: \"kubernetes.io/projected/04ef5055-0453-443e-b7cc-8fd93041e899-kube-api-access-896mw\") pod \"heat-db-sync-snmgh\" (UID: \"04ef5055-0453-443e-b7cc-8fd93041e899\") " pod="openstack/heat-db-sync-snmgh" Jan 05 14:12:58 crc kubenswrapper[4740]: I0105 14:12:58.104935 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ef5055-0453-443e-b7cc-8fd93041e899-combined-ca-bundle\") pod \"heat-db-sync-snmgh\" (UID: \"04ef5055-0453-443e-b7cc-8fd93041e899\") " pod="openstack/heat-db-sync-snmgh" Jan 05 14:12:58 crc kubenswrapper[4740]: I0105 14:12:58.104976 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ef5055-0453-443e-b7cc-8fd93041e899-config-data\") pod \"heat-db-sync-snmgh\" (UID: \"04ef5055-0453-443e-b7cc-8fd93041e899\") " pod="openstack/heat-db-sync-snmgh" Jan 05 14:12:58 crc kubenswrapper[4740]: I0105 14:12:58.207357 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ef5055-0453-443e-b7cc-8fd93041e899-config-data\") pod \"heat-db-sync-snmgh\" (UID: \"04ef5055-0453-443e-b7cc-8fd93041e899\") " pod="openstack/heat-db-sync-snmgh" Jan 05 14:12:58 crc kubenswrapper[4740]: I0105 14:12:58.207553 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-896mw\" (UniqueName: \"kubernetes.io/projected/04ef5055-0453-443e-b7cc-8fd93041e899-kube-api-access-896mw\") pod \"heat-db-sync-snmgh\" (UID: \"04ef5055-0453-443e-b7cc-8fd93041e899\") " pod="openstack/heat-db-sync-snmgh" Jan 05 14:12:58 crc kubenswrapper[4740]: I0105 14:12:58.207842 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ef5055-0453-443e-b7cc-8fd93041e899-combined-ca-bundle\") pod \"heat-db-sync-snmgh\" (UID: \"04ef5055-0453-443e-b7cc-8fd93041e899\") " 
pod="openstack/heat-db-sync-snmgh" Jan 05 14:12:58 crc kubenswrapper[4740]: I0105 14:12:58.213759 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ef5055-0453-443e-b7cc-8fd93041e899-config-data\") pod \"heat-db-sync-snmgh\" (UID: \"04ef5055-0453-443e-b7cc-8fd93041e899\") " pod="openstack/heat-db-sync-snmgh" Jan 05 14:12:58 crc kubenswrapper[4740]: I0105 14:12:58.214792 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ef5055-0453-443e-b7cc-8fd93041e899-combined-ca-bundle\") pod \"heat-db-sync-snmgh\" (UID: \"04ef5055-0453-443e-b7cc-8fd93041e899\") " pod="openstack/heat-db-sync-snmgh" Jan 05 14:12:58 crc kubenswrapper[4740]: I0105 14:12:58.238123 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-896mw\" (UniqueName: \"kubernetes.io/projected/04ef5055-0453-443e-b7cc-8fd93041e899-kube-api-access-896mw\") pod \"heat-db-sync-snmgh\" (UID: \"04ef5055-0453-443e-b7cc-8fd93041e899\") " pod="openstack/heat-db-sync-snmgh" Jan 05 14:12:58 crc kubenswrapper[4740]: I0105 14:12:58.273164 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-snmgh" Jan 05 14:12:58 crc kubenswrapper[4740]: I0105 14:12:58.762162 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-snmgh"] Jan 05 14:12:58 crc kubenswrapper[4740]: W0105 14:12:58.769227 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ef5055_0453_443e_b7cc_8fd93041e899.slice/crio-af84f7bdb27e754daa46359e0b647d0080b538dff4c95d437ac72bddfb2187f1 WatchSource:0}: Error finding container af84f7bdb27e754daa46359e0b647d0080b538dff4c95d437ac72bddfb2187f1: Status 404 returned error can't find the container with id af84f7bdb27e754daa46359e0b647d0080b538dff4c95d437ac72bddfb2187f1 Jan 05 14:12:58 crc kubenswrapper[4740]: I0105 14:12:58.954865 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-snmgh" event={"ID":"04ef5055-0453-443e-b7cc-8fd93041e899","Type":"ContainerStarted","Data":"af84f7bdb27e754daa46359e0b647d0080b538dff4c95d437ac72bddfb2187f1"} Jan 05 14:12:58 crc kubenswrapper[4740]: I0105 14:12:58.984417 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71b00767-28bf-4158-9f4e-bb3c5dafd7d5" path="/var/lib/kubelet/pods/71b00767-28bf-4158-9f4e-bb3c5dafd7d5/volumes" Jan 05 14:12:58 crc kubenswrapper[4740]: I0105 14:12:58.985482 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed5f3fd5-cc0b-42c1-8cee-b28452adde1d" path="/var/lib/kubelet/pods/ed5f3fd5-cc0b-42c1-8cee-b28452adde1d/volumes" Jan 05 14:12:59 crc kubenswrapper[4740]: I0105 14:12:59.972803 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9693bc-68c2-4ba7-bd23-95303bf357d4","Type":"ContainerStarted","Data":"e32cf6677fb895f182f3e9e0a95ab82c117a7cd76e598df2ce122b40bcca02af"} Jan 05 14:13:00 crc kubenswrapper[4740]: I0105 14:13:00.248431 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 05 14:13:00 crc kubenswrapper[4740]: I0105 14:13:00.389979 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Jan 05 14:13:00 crc kubenswrapper[4740]: I0105 14:13:00.998920 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"dc9693bc-68c2-4ba7-bd23-95303bf357d4","Type":"ContainerStarted","Data":"b6d556bbbeb18f99289f6c671068a9eb759315adfd94edb9c5d3468bfb17b809"} Jan 05 14:13:01 crc kubenswrapper[4740]: I0105 14:13:01.245518 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:13:01 crc kubenswrapper[4740]: I0105 14:13:01.399784 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 05 14:13:01 crc kubenswrapper[4740]: I0105 14:13:01.916256 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:13:01 crc kubenswrapper[4740]: I0105 14:13:01.916304 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:13:02 crc kubenswrapper[4740]: I0105 14:13:02.011998 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9693bc-68c2-4ba7-bd23-95303bf357d4","Type":"ContainerStarted","Data":"a0ddb2c692199ed5095bd63a18921b02a751c2280dc066e4b3a7c43794a23669"} Jan 05 14:13:04 crc kubenswrapper[4740]: I0105 14:13:04.053139 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9693bc-68c2-4ba7-bd23-95303bf357d4","Type":"ContainerStarted","Data":"1a6e00611414c14aab7331a0c71e84b92444d92f09c89e6f118a09ff98ef5eea"} Jan 05 14:13:04 crc kubenswrapper[4740]: I0105 14:13:04.053543 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerName="ceilometer-central-agent" containerID="cri-o://e32cf6677fb895f182f3e9e0a95ab82c117a7cd76e598df2ce122b40bcca02af" gracePeriod=30 Jan 05 14:13:04 crc kubenswrapper[4740]: I0105 14:13:04.053768 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 05 14:13:04 crc kubenswrapper[4740]: I0105 14:13:04.054129 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerName="proxy-httpd" containerID="cri-o://1a6e00611414c14aab7331a0c71e84b92444d92f09c89e6f118a09ff98ef5eea" gracePeriod=30 Jan 05 14:13:04 crc kubenswrapper[4740]: I0105 14:13:04.054183 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerName="sg-core" containerID="cri-o://a0ddb2c692199ed5095bd63a18921b02a751c2280dc066e4b3a7c43794a23669" gracePeriod=30 Jan 05 14:13:04 crc kubenswrapper[4740]: I0105 14:13:04.054229 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerName="ceilometer-notification-agent" containerID="cri-o://b6d556bbbeb18f99289f6c671068a9eb759315adfd94edb9c5d3468bfb17b809" gracePeriod=30 Jan 05 14:13:04 crc kubenswrapper[4740]: I0105 14:13:04.158537 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=3.097774212 podStartE2EDuration="8.15851926s" podCreationTimestamp="2026-01-05 14:12:56 +0000 UTC" firstStartedPulling="2026-01-05 14:12:57.823177052 +0000 UTC m=+1427.130085631" lastFinishedPulling="2026-01-05 14:13:02.8839221 +0000 UTC m=+1432.190830679" observedRunningTime="2026-01-05 14:13:04.151181843 +0000 UTC m=+1433.458090432" watchObservedRunningTime="2026-01-05 14:13:04.15851926 +0000 UTC m=+1433.465427839" Jan 05 14:13:05 crc kubenswrapper[4740]: I0105 14:13:05.072562 4740 generic.go:334] "Generic (PLEG): container finished" podID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerID="1a6e00611414c14aab7331a0c71e84b92444d92f09c89e6f118a09ff98ef5eea" exitCode=0 Jan 05 14:13:05 crc kubenswrapper[4740]: I0105 14:13:05.072870 4740 generic.go:334] "Generic (PLEG): container finished" podID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerID="a0ddb2c692199ed5095bd63a18921b02a751c2280dc066e4b3a7c43794a23669" exitCode=2 Jan 05 14:13:05 crc kubenswrapper[4740]: I0105 14:13:05.072879 4740 generic.go:334] "Generic (PLEG): container finished" podID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerID="b6d556bbbeb18f99289f6c671068a9eb759315adfd94edb9c5d3468bfb17b809" exitCode=0 Jan 05 14:13:05 crc kubenswrapper[4740]: I0105 14:13:05.072644 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9693bc-68c2-4ba7-bd23-95303bf357d4","Type":"ContainerDied","Data":"1a6e00611414c14aab7331a0c71e84b92444d92f09c89e6f118a09ff98ef5eea"} Jan 05 14:13:05 crc kubenswrapper[4740]: I0105 14:13:05.072918 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9693bc-68c2-4ba7-bd23-95303bf357d4","Type":"ContainerDied","Data":"a0ddb2c692199ed5095bd63a18921b02a751c2280dc066e4b3a7c43794a23669"} Jan 05 14:13:05 crc kubenswrapper[4740]: I0105 14:13:05.072933 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9693bc-68c2-4ba7-bd23-95303bf357d4","Type":"ContainerDied","Data":"b6d556bbbeb18f99289f6c671068a9eb759315adfd94edb9c5d3468bfb17b809"} Jan 05 14:13:05 crc kubenswrapper[4740]: I0105 14:13:05.773090 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-2" podUID="eeb4c870-b0d8-4d92-82c1-aedb35200c4b" containerName="rabbitmq" containerID="cri-o://dd161248466ee88f4ab07ec76467467275c500b0f8e46abc31cd22b51e46ddcf" gracePeriod=604795 Jan 05 14:13:06 crc kubenswrapper[4740]: I0105 14:13:06.296567 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="54c80dba-b90f-4288-a366-4ff77f76db22" containerName="rabbitmq" containerID="cri-o://0ac551bcb4a2ac3c200ba4459a08bc635b2416c696711644844f2e4d08c36508" gracePeriod=604796 Jan 05 14:13:06 crc kubenswrapper[4740]: I0105 14:13:06.678010 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="eeb4c870-b0d8-4d92-82c1-aedb35200c4b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: connect: connection refused" Jan 05 14:13:06 crc kubenswrapper[4740]: I0105 14:13:06.766794 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="54c80dba-b90f-4288-a366-4ff77f76db22" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.138:5671: connect: connection refused" Jan 05 14:13:08 crc kubenswrapper[4740]: I0105 14:13:08.125662 4740 generic.go:334] "Generic 
(PLEG): container finished" podID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerID="e32cf6677fb895f182f3e9e0a95ab82c117a7cd76e598df2ce122b40bcca02af" exitCode=0 Jan 05 14:13:08 crc kubenswrapper[4740]: I0105 14:13:08.125736 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9693bc-68c2-4ba7-bd23-95303bf357d4","Type":"ContainerDied","Data":"e32cf6677fb895f182f3e9e0a95ab82c117a7cd76e598df2ce122b40bcca02af"} Jan 05 14:13:12 crc kubenswrapper[4740]: I0105 14:13:12.189365 4740 generic.go:334] "Generic (PLEG): container finished" podID="eeb4c870-b0d8-4d92-82c1-aedb35200c4b" containerID="dd161248466ee88f4ab07ec76467467275c500b0f8e46abc31cd22b51e46ddcf" exitCode=0 Jan 05 14:13:12 crc kubenswrapper[4740]: I0105 14:13:12.189540 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"eeb4c870-b0d8-4d92-82c1-aedb35200c4b","Type":"ContainerDied","Data":"dd161248466ee88f4ab07ec76467467275c500b0f8e46abc31cd22b51e46ddcf"} Jan 05 14:13:12 crc kubenswrapper[4740]: E0105 14:13:12.492440 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54c80dba_b90f_4288_a366_4ff77f76db22.slice/crio-0ac551bcb4a2ac3c200ba4459a08bc635b2416c696711644844f2e4d08c36508.scope\": RecentStats: unable to find data in memory cache]" Jan 05 14:13:13 crc kubenswrapper[4740]: I0105 14:13:13.209569 4740 generic.go:334] "Generic (PLEG): container finished" podID="54c80dba-b90f-4288-a366-4ff77f76db22" containerID="0ac551bcb4a2ac3c200ba4459a08bc635b2416c696711644844f2e4d08c36508" exitCode=0 Jan 05 14:13:13 crc kubenswrapper[4740]: I0105 14:13:13.209661 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"54c80dba-b90f-4288-a366-4ff77f76db22","Type":"ContainerDied","Data":"0ac551bcb4a2ac3c200ba4459a08bc635b2416c696711644844f2e4d08c36508"} Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.425647 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-2ff28"] Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.428384 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.431820 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.436980 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-2ff28"] Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.537946 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-config\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.538140 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.538195 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.538229 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.538280 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.538372 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvn5j\" (UniqueName: \"kubernetes.io/projected/0815c1c2-7fdf-4609-8de3-34aa56424c8a-kube-api-access-jvn5j\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.538435 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.641214 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvn5j\" (UniqueName: \"kubernetes.io/projected/0815c1c2-7fdf-4609-8de3-34aa56424c8a-kube-api-access-jvn5j\") pod 
\"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.641305 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.641454 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-config\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.641545 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.641580 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.641629 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.641672 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.642621 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-config\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.642721 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.642814 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " 
pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.643300 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.643510 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.645055 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.665816 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvn5j\" (UniqueName: \"kubernetes.io/projected/0815c1c2-7fdf-4609-8de3-34aa56424c8a-kube-api-access-jvn5j\") pod \"dnsmasq-dns-5b75489c6f-2ff28\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:14 crc kubenswrapper[4740]: I0105 14:13:14.770678 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:16 crc kubenswrapper[4740]: I0105 14:13:16.677861 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="eeb4c870-b0d8-4d92-82c1-aedb35200c4b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: connect: connection refused" Jan 05 14:13:16 crc kubenswrapper[4740]: I0105 14:13:16.767734 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="54c80dba-b90f-4288-a366-4ff77f76db22" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.138:5671: connect: connection refused" Jan 05 14:13:17 crc kubenswrapper[4740]: I0105 14:13:17.821531 4740 patch_prober.go:28] interesting pod/controller-manager-6cdbbbb5bc-rbd47 container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 14:13:17 crc kubenswrapper[4740]: I0105 14:13:17.821618 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" podUID="f061b27b-1e1c-42c2-9230-8f82deda6325" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 14:13:20 crc kubenswrapper[4740]: I0105 14:13:20.430274 4740 patch_prober.go:28] interesting pod/loki-operator-controller-manager-56d45b676b-q44gh container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.48:8081/readyz\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 14:13:20 crc kubenswrapper[4740]: I0105 14:13:20.430757 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" podUID="50147f9c-3a52-4e0e-b0cc-1fd94e7def10" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.48:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 14:13:26 crc kubenswrapper[4740]: I0105 14:13:26.678181 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="eeb4c870-b0d8-4d92-82c1-aedb35200c4b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: connect: connection refused" Jan 05 14:13:26 crc kubenswrapper[4740]: I0105 14:13:26.678946 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Jan 05 14:13:26 crc kubenswrapper[4740]: I0105 14:13:26.766784 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="54c80dba-b90f-4288-a366-4ff77f76db22" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.138:5671: connect: connection refused" Jan 05 14:13:26 crc kubenswrapper[4740]: I0105 14:13:26.767256 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:27 crc kubenswrapper[4740]: E0105 14:13:27.845594 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Jan 05 14:13:27 crc kubenswrapper[4740]: E0105 14:13:27.845853 4740 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Jan 05 14:13:27 crc kubenswrapper[4740]: E0105 14:13:27.845963 4740 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-896mw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-snmgh_openstack(04ef5055-0453-443e-b7cc-8fd93041e899): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 14:13:27 crc kubenswrapper[4740]: E0105 14:13:27.847340 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-snmgh" podUID="04ef5055-0453-443e-b7cc-8fd93041e899" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.006925 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.127604 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-scripts\") pod \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.127699 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-config-data\") pod \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.127762 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-sg-core-conf-yaml\") pod \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.127826 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9693bc-68c2-4ba7-bd23-95303bf357d4-log-httpd\") pod \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.127877 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-combined-ca-bundle\") pod \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.127904 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65q9c\" (UniqueName: \"kubernetes.io/projected/dc9693bc-68c2-4ba7-bd23-95303bf357d4-kube-api-access-65q9c\") pod \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.127926 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9693bc-68c2-4ba7-bd23-95303bf357d4-run-httpd\") pod \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.127982 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-ceilometer-tls-certs\") pod \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\" (UID: \"dc9693bc-68c2-4ba7-bd23-95303bf357d4\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.129880 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc9693bc-68c2-4ba7-bd23-95303bf357d4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dc9693bc-68c2-4ba7-bd23-95303bf357d4" (UID: "dc9693bc-68c2-4ba7-bd23-95303bf357d4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.130142 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc9693bc-68c2-4ba7-bd23-95303bf357d4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dc9693bc-68c2-4ba7-bd23-95303bf357d4" (UID: "dc9693bc-68c2-4ba7-bd23-95303bf357d4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.134770 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc9693bc-68c2-4ba7-bd23-95303bf357d4-kube-api-access-65q9c" (OuterVolumeSpecName: "kube-api-access-65q9c") pod "dc9693bc-68c2-4ba7-bd23-95303bf357d4" (UID: "dc9693bc-68c2-4ba7-bd23-95303bf357d4"). InnerVolumeSpecName "kube-api-access-65q9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.142450 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-scripts" (OuterVolumeSpecName: "scripts") pod "dc9693bc-68c2-4ba7-bd23-95303bf357d4" (UID: "dc9693bc-68c2-4ba7-bd23-95303bf357d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.228131 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dc9693bc-68c2-4ba7-bd23-95303bf357d4" (UID: "dc9693bc-68c2-4ba7-bd23-95303bf357d4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.230689 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.231168 4740 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.231356 4740 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9693bc-68c2-4ba7-bd23-95303bf357d4-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.231758 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65q9c\" (UniqueName: \"kubernetes.io/projected/dc9693bc-68c2-4ba7-bd23-95303bf357d4-kube-api-access-65q9c\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.231950 4740 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc9693bc-68c2-4ba7-bd23-95303bf357d4-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.235220 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "dc9693bc-68c2-4ba7-bd23-95303bf357d4" (UID: "dc9693bc-68c2-4ba7-bd23-95303bf357d4"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.293372 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc9693bc-68c2-4ba7-bd23-95303bf357d4" (UID: "dc9693bc-68c2-4ba7-bd23-95303bf357d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.324353 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-config-data" (OuterVolumeSpecName: "config-data") pod "dc9693bc-68c2-4ba7-bd23-95303bf357d4" (UID: "dc9693bc-68c2-4ba7-bd23-95303bf357d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.335369 4740 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.342576 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.342616 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9693bc-68c2-4ba7-bd23-95303bf357d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.401530 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.412370 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.430558 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.430628 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc9693bc-68c2-4ba7-bd23-95303bf357d4","Type":"ContainerDied","Data":"236285c7654e124741e33763eb8873aea4a3f4abf2b4dbff4cf9e286ef51fca2"} Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.430680 4740 scope.go:117] "RemoveContainer" containerID="1a6e00611414c14aab7331a0c71e84b92444d92f09c89e6f118a09ff98ef5eea" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.439998 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"54c80dba-b90f-4288-a366-4ff77f76db22","Type":"ContainerDied","Data":"a27ec4ba6f23396161006fb1834eb46af6199f739eac243960fc45cb043e187b"} Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.440231 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.443553 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-pod-info\") pod \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.443594 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/54c80dba-b90f-4288-a366-4ff77f76db22-erlang-cookie-secret\") pod \"54c80dba-b90f-4288-a366-4ff77f76db22\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.443639 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m9b7\" (UniqueName: \"kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-kube-api-access-2m9b7\") pod \"54c80dba-b90f-4288-a366-4ff77f76db22\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.446093 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.446511 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"eeb4c870-b0d8-4d92-82c1-aedb35200c4b","Type":"ContainerDied","Data":"fb4dceae4dea151c88f744052f4169082814f68c78a72e4e8329fe08490ab23f"} Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.449029 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0142779e-9b14-4987-927f-6abae3becbdb\") pod \"54c80dba-b90f-4288-a366-4ff77f76db22\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.449158 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-confd\") pod \"54c80dba-b90f-4288-a366-4ff77f76db22\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.449183 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-plugins-conf\") pod \"54c80dba-b90f-4288-a366-4ff77f76db22\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.449211 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-erlang-cookie\") pod \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.449253 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-tls\") pod \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.449318 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-jxvnp\" (UniqueName: \"kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-kube-api-access-jxvnp\") pod \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.449350 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-erlang-cookie\") pod \"54c80dba-b90f-4288-a366-4ff77f76db22\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.450174 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54c80dba-b90f-4288-a366-4ff77f76db22-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "54c80dba-b90f-4288-a366-4ff77f76db22" (UID: "54c80dba-b90f-4288-a366-4ff77f76db22"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.450686 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "54c80dba-b90f-4288-a366-4ff77f76db22" (UID: "54c80dba-b90f-4288-a366-4ff77f76db22"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.451310 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-kube-api-access-2m9b7" (OuterVolumeSpecName: "kube-api-access-2m9b7") pod "54c80dba-b90f-4288-a366-4ff77f76db22" (UID: "54c80dba-b90f-4288-a366-4ff77f76db22"). InnerVolumeSpecName "kube-api-access-2m9b7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.452205 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "54c80dba-b90f-4288-a366-4ff77f76db22" (UID: "54c80dba-b90f-4288-a366-4ff77f76db22"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.454209 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\") pod \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.454302 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-plugins\") pod \"54c80dba-b90f-4288-a366-4ff77f76db22\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.454364 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-plugins-conf\") pod \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.454427 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-confd\") pod \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.454452 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-tls\") pod \"54c80dba-b90f-4288-a366-4ff77f76db22\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.454500 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-plugins\") pod \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.454520 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-config-data\") pod \"54c80dba-b90f-4288-a366-4ff77f76db22\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.454556 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-server-conf\") pod \"54c80dba-b90f-4288-a366-4ff77f76db22\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.454579 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-server-conf\") pod \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.454616 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/54c80dba-b90f-4288-a366-4ff77f76db22-pod-info\") pod \"54c80dba-b90f-4288-a366-4ff77f76db22\" (UID: 
\"54c80dba-b90f-4288-a366-4ff77f76db22\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.454732 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-erlang-cookie-secret\") pod \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.454775 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-config-data\") pod \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\" (UID: \"eeb4c870-b0d8-4d92-82c1-aedb35200c4b\") " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.454777 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "54c80dba-b90f-4288-a366-4ff77f76db22" (UID: "54c80dba-b90f-4288-a366-4ff77f76db22"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.455392 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "eeb4c870-b0d8-4d92-82c1-aedb35200c4b" (UID: "eeb4c870-b0d8-4d92-82c1-aedb35200c4b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.455798 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.455818 4740 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/54c80dba-b90f-4288-a366-4ff77f76db22-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.455831 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m9b7\" (UniqueName: \"kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-kube-api-access-2m9b7\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.455845 4740 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.455857 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.455868 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.456609 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-plugins-conf" (OuterVolumeSpecName: 
"plugins-conf") pod "eeb4c870-b0d8-4d92-82c1-aedb35200c4b" (UID: "eeb4c870-b0d8-4d92-82c1-aedb35200c4b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.459029 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "eeb4c870-b0d8-4d92-82c1-aedb35200c4b" (UID: "eeb4c870-b0d8-4d92-82c1-aedb35200c4b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.472746 4740 scope.go:117] "RemoveContainer" containerID="a0ddb2c692199ed5095bd63a18921b02a751c2280dc066e4b3a7c43794a23669" Jan 05 14:13:28 crc kubenswrapper[4740]: E0105 14:13:28.472900 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-snmgh" podUID="04ef5055-0453-443e-b7cc-8fd93041e899" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.475480 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/54c80dba-b90f-4288-a366-4ff77f76db22-pod-info" (OuterVolumeSpecName: "pod-info") pod "54c80dba-b90f-4288-a366-4ff77f76db22" (UID: "54c80dba-b90f-4288-a366-4ff77f76db22"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.475738 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-pod-info" (OuterVolumeSpecName: "pod-info") pod "eeb4c870-b0d8-4d92-82c1-aedb35200c4b" (UID: "eeb4c870-b0d8-4d92-82c1-aedb35200c4b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.480663 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "eeb4c870-b0d8-4d92-82c1-aedb35200c4b" (UID: "eeb4c870-b0d8-4d92-82c1-aedb35200c4b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.480672 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "54c80dba-b90f-4288-a366-4ff77f76db22" (UID: "54c80dba-b90f-4288-a366-4ff77f76db22"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.481087 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "eeb4c870-b0d8-4d92-82c1-aedb35200c4b" (UID: "eeb4c870-b0d8-4d92-82c1-aedb35200c4b"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.520848 4740 scope.go:117] "RemoveContainer" containerID="b6d556bbbeb18f99289f6c671068a9eb759315adfd94edb9c5d3468bfb17b809" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.533255 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-kube-api-access-jxvnp" (OuterVolumeSpecName: "kube-api-access-jxvnp") pod "eeb4c870-b0d8-4d92-82c1-aedb35200c4b" (UID: "eeb4c870-b0d8-4d92-82c1-aedb35200c4b"). InnerVolumeSpecName "kube-api-access-jxvnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: E0105 14:13:28.551165 4740 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0142779e-9b14-4987-927f-6abae3becbdb podName:54c80dba-b90f-4288-a366-4ff77f76db22 nodeName:}" failed. No retries permitted until 2026-01-05 14:13:29.051139041 +0000 UTC m=+1458.358047620 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "persistence" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0142779e-9b14-4987-927f-6abae3becbdb") pod "54c80dba-b90f-4288-a366-4ff77f76db22" (UID: "54c80dba-b90f-4288-a366-4ff77f76db22") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.557909 4740 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.557937 4740 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-pod-info\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.557947 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.557957 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.557966 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxvnp\" (UniqueName: \"kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-kube-api-access-jxvnp\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.557974 4740 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.557982 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.557990 4740 reconciler_common.go:293] "Volume detached for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/54c80dba-b90f-4288-a366-4ff77f76db22-pod-info\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.558918 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.564896 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d" (OuterVolumeSpecName: "persistence") pod "eeb4c870-b0d8-4d92-82c1-aedb35200c4b" (UID: "eeb4c870-b0d8-4d92-82c1-aedb35200c4b"). InnerVolumeSpecName "pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.577319 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-config-data" (OuterVolumeSpecName: "config-data") pod "54c80dba-b90f-4288-a366-4ff77f76db22" (UID: "54c80dba-b90f-4288-a366-4ff77f76db22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.578939 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-config-data" (OuterVolumeSpecName: "config-data") pod "eeb4c870-b0d8-4d92-82c1-aedb35200c4b" (UID: "eeb4c870-b0d8-4d92-82c1-aedb35200c4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.582281 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.582373 4740 scope.go:117] "RemoveContainer" containerID="e32cf6677fb895f182f3e9e0a95ab82c117a7cd76e598df2ce122b40bcca02af" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.602992 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-server-conf" (OuterVolumeSpecName: "server-conf") pod "eeb4c870-b0d8-4d92-82c1-aedb35200c4b" (UID: "eeb4c870-b0d8-4d92-82c1-aedb35200c4b"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.615005 4740 scope.go:117] "RemoveContainer" containerID="0ac551bcb4a2ac3c200ba4459a08bc635b2416c696711644844f2e4d08c36508" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.636862 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:13:28 crc kubenswrapper[4740]: E0105 14:13:28.637362 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerName="ceilometer-notification-agent" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.637379 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerName="ceilometer-notification-agent" Jan 05 14:13:28 crc kubenswrapper[4740]: E0105 14:13:28.637397 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerName="ceilometer-central-agent" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.637404 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerName="ceilometer-central-agent" Jan 05 14:13:28 crc kubenswrapper[4740]: E0105 14:13:28.637413 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb4c870-b0d8-4d92-82c1-aedb35200c4b" containerName="rabbitmq" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.637420 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb4c870-b0d8-4d92-82c1-aedb35200c4b" containerName="rabbitmq" Jan 05 14:13:28 crc kubenswrapper[4740]: E0105 14:13:28.637437 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb4c870-b0d8-4d92-82c1-aedb35200c4b" containerName="setup-container" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.637443 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb4c870-b0d8-4d92-82c1-aedb35200c4b" containerName="setup-container" Jan 05 14:13:28 crc kubenswrapper[4740]: E0105 14:13:28.637459 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c80dba-b90f-4288-a366-4ff77f76db22" containerName="setup-container" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.637465 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c80dba-b90f-4288-a366-4ff77f76db22" containerName="setup-container" Jan 05 14:13:28 crc kubenswrapper[4740]: E0105 14:13:28.637491 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c80dba-b90f-4288-a366-4ff77f76db22" containerName="rabbitmq" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.637497 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c80dba-b90f-4288-a366-4ff77f76db22" containerName="rabbitmq" Jan 05 14:13:28 crc kubenswrapper[4740]: E0105 14:13:28.637511 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerName="sg-core" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.637516 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerName="sg-core" Jan 05 14:13:28 crc kubenswrapper[4740]: E0105 14:13:28.637527 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerName="proxy-httpd" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.637533 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerName="proxy-httpd" Jan 
05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.637731 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerName="ceilometer-notification-agent" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.637747 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerName="proxy-httpd" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.637759 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeb4c870-b0d8-4d92-82c1-aedb35200c4b" containerName="rabbitmq" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.637771 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerName="sg-core" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.637796 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="54c80dba-b90f-4288-a366-4ff77f76db22" containerName="rabbitmq" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.637806 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerName="ceilometer-central-agent" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.640214 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.643200 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.643346 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.643454 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.663288 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.663319 4740 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-server-conf\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.663356 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.663396 4740 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\") on node \"crc\" " Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.663698 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-server-conf" (OuterVolumeSpecName: "server-conf") pod "54c80dba-b90f-4288-a366-4ff77f76db22" (UID: "54c80dba-b90f-4288-a366-4ff77f76db22"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.668371 4740 scope.go:117] "RemoveContainer" containerID="1c047226985f396d19c238260e8d4fcd5bd4861cf84ef028ed887a1b0fa068b9" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.675373 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.701615 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-2ff28"] Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.707270 4740 scope.go:117] "RemoveContainer" containerID="dd161248466ee88f4ab07ec76467467275c500b0f8e46abc31cd22b51e46ddcf" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.719860 4740 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.720021 4740 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d") on node "crc" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.721759 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "54c80dba-b90f-4288-a366-4ff77f76db22" (UID: "54c80dba-b90f-4288-a366-4ff77f76db22"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.724272 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "eeb4c870-b0d8-4d92-82c1-aedb35200c4b" (UID: "eeb4c870-b0d8-4d92-82c1-aedb35200c4b"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.759967 4740 scope.go:117] "RemoveContainer" containerID="b3dbdf8362bc625ca04348e9bb74d68d3f768dd9528dbaeca36d3f3a321e26f6" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.765736 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b922e17-0ca9-49cd-8af7-b78776b990bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.765789 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zptbv\" (UniqueName: \"kubernetes.io/projected/2b922e17-0ca9-49cd-8af7-b78776b990bb-kube-api-access-zptbv\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.766014 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b922e17-0ca9-49cd-8af7-b78776b990bb-run-httpd\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.766165 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b922e17-0ca9-49cd-8af7-b78776b990bb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.766213 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b922e17-0ca9-49cd-8af7-b78776b990bb-scripts\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.766330 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b922e17-0ca9-49cd-8af7-b78776b990bb-config-data\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.766485 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b922e17-0ca9-49cd-8af7-b78776b990bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.766524 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b922e17-0ca9-49cd-8af7-b78776b990bb-log-httpd\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.769346 4740 reconciler_common.go:293] "Volume detached for volume \"pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 
crc kubenswrapper[4740]: I0105 14:13:28.769375 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eeb4c870-b0d8-4d92-82c1-aedb35200c4b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.769386 4740 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/54c80dba-b90f-4288-a366-4ff77f76db22-server-conf\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.769398 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/54c80dba-b90f-4288-a366-4ff77f76db22-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.790009 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.815178 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-2"] Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.840593 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.842745 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.875685 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b922e17-0ca9-49cd-8af7-b78776b990bb-run-httpd\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.875758 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b922e17-0ca9-49cd-8af7-b78776b990bb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.875790 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b922e17-0ca9-49cd-8af7-b78776b990bb-scripts\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.875847 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b922e17-0ca9-49cd-8af7-b78776b990bb-config-data\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.875892 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b922e17-0ca9-49cd-8af7-b78776b990bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.875915 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b922e17-0ca9-49cd-8af7-b78776b990bb-log-httpd\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc 
kubenswrapper[4740]: I0105 14:13:28.876124 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b922e17-0ca9-49cd-8af7-b78776b990bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.876160 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zptbv\" (UniqueName: \"kubernetes.io/projected/2b922e17-0ca9-49cd-8af7-b78776b990bb-kube-api-access-zptbv\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.876798 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b922e17-0ca9-49cd-8af7-b78776b990bb-run-httpd\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.883828 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b922e17-0ca9-49cd-8af7-b78776b990bb-scripts\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.884756 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b922e17-0ca9-49cd-8af7-b78776b990bb-config-data\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.885712 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b922e17-0ca9-49cd-8af7-b78776b990bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.885787 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b922e17-0ca9-49cd-8af7-b78776b990bb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.885971 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b922e17-0ca9-49cd-8af7-b78776b990bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.887645 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b922e17-0ca9-49cd-8af7-b78776b990bb-log-httpd\") pod \"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.889870 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.891376 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zptbv\" (UniqueName: \"kubernetes.io/projected/2b922e17-0ca9-49cd-8af7-b78776b990bb-kube-api-access-zptbv\") pod 
\"ceilometer-0\" (UID: \"2b922e17-0ca9-49cd-8af7-b78776b990bb\") " pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.969000 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.977874 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8150ac8e-6303-4af1-8a21-6fb434df508b-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.977925 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8150ac8e-6303-4af1-8a21-6fb434df508b-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.977976 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8150ac8e-6303-4af1-8a21-6fb434df508b-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.977999 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8150ac8e-6303-4af1-8a21-6fb434df508b-config-data\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.978020 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8150ac8e-6303-4af1-8a21-6fb434df508b-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.978049 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8150ac8e-6303-4af1-8a21-6fb434df508b-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.978090 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.978115 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8150ac8e-6303-4af1-8a21-6fb434df508b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.978179 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-zsp97\" (UniqueName: \"kubernetes.io/projected/8150ac8e-6303-4af1-8a21-6fb434df508b-kube-api-access-zsp97\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.978253 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8150ac8e-6303-4af1-8a21-6fb434df508b-pod-info\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.978275 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8150ac8e-6303-4af1-8a21-6fb434df508b-server-conf\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.982350 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" path="/var/lib/kubelet/pods/dc9693bc-68c2-4ba7-bd23-95303bf357d4/volumes" Jan 05 14:13:28 crc kubenswrapper[4740]: I0105 14:13:28.983419 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeb4c870-b0d8-4d92-82c1-aedb35200c4b" path="/var/lib/kubelet/pods/eeb4c870-b0d8-4d92-82c1-aedb35200c4b/volumes" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.079691 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0142779e-9b14-4987-927f-6abae3becbdb\") pod \"54c80dba-b90f-4288-a366-4ff77f76db22\" (UID: \"54c80dba-b90f-4288-a366-4ff77f76db22\") " Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.080329 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8150ac8e-6303-4af1-8a21-6fb434df508b-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.080381 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8150ac8e-6303-4af1-8a21-6fb434df508b-config-data\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.080414 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8150ac8e-6303-4af1-8a21-6fb434df508b-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.080471 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8150ac8e-6303-4af1-8a21-6fb434df508b-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.080513 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.080558 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8150ac8e-6303-4af1-8a21-6fb434df508b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.080623 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsp97\" (UniqueName: \"kubernetes.io/projected/8150ac8e-6303-4af1-8a21-6fb434df508b-kube-api-access-zsp97\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.080716 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8150ac8e-6303-4af1-8a21-6fb434df508b-pod-info\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.080750 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8150ac8e-6303-4af1-8a21-6fb434df508b-server-conf\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.080826 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8150ac8e-6303-4af1-8a21-6fb434df508b-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.080862 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8150ac8e-6303-4af1-8a21-6fb434df508b-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.088597 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8150ac8e-6303-4af1-8a21-6fb434df508b-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.088849 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8150ac8e-6303-4af1-8a21-6fb434df508b-server-conf\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.089514 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8150ac8e-6303-4af1-8a21-6fb434df508b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: 
I0105 14:13:29.089908 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8150ac8e-6303-4af1-8a21-6fb434df508b-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.093267 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8150ac8e-6303-4af1-8a21-6fb434df508b-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.093623 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8150ac8e-6303-4af1-8a21-6fb434df508b-config-data\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.094336 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.094356 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f2f7b9b5abaa8990d589f1063e4eaf9d9b88b37cbfdc58c57397865c99b07d3b/globalmount\"" pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.110705 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsp97\" (UniqueName: \"kubernetes.io/projected/8150ac8e-6303-4af1-8a21-6fb434df508b-kube-api-access-zsp97\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.114909 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8150ac8e-6303-4af1-8a21-6fb434df508b-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.115586 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0142779e-9b14-4987-927f-6abae3becbdb" (OuterVolumeSpecName: "persistence") pod "54c80dba-b90f-4288-a366-4ff77f76db22" (UID: "54c80dba-b90f-4288-a366-4ff77f76db22"). InnerVolumeSpecName "pvc-0142779e-9b14-4987-927f-6abae3becbdb". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.116479 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8150ac8e-6303-4af1-8a21-6fb434df508b-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.116500 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8150ac8e-6303-4af1-8a21-6fb434df508b-pod-info\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.183207 4740 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0142779e-9b14-4987-927f-6abae3becbdb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0142779e-9b14-4987-927f-6abae3becbdb\") on node \"crc\" " Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.190045 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dd9c86cd-4cac-44eb-8fc8-51a2868a884d\") pod \"rabbitmq-server-2\" (UID: \"8150ac8e-6303-4af1-8a21-6fb434df508b\") " pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.225494 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.230040 4740 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.230232 4740 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0142779e-9b14-4987-927f-6abae3becbdb" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0142779e-9b14-4987-927f-6abae3becbdb") on node "crc" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.244241 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.268114 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.286022 4740 reconciler_common.go:293] "Volume detached for volume \"pvc-0142779e-9b14-4987-927f-6abae3becbdb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0142779e-9b14-4987-927f-6abae3becbdb\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.295194 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.303872 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.306976 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.307203 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.316238 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.316528 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-2n6bj" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.316660 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.316957 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.317200 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.341210 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.491715 4740 generic.go:334] "Generic (PLEG): container finished" podID="0815c1c2-7fdf-4609-8de3-34aa56424c8a" containerID="0e338a3ec32f130f2582ee5486c63ce39d5ce3f4ed0f744000d903c4e7162182" exitCode=0 Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.492505 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" event={"ID":"0815c1c2-7fdf-4609-8de3-34aa56424c8a","Type":"ContainerDied","Data":"0e338a3ec32f130f2582ee5486c63ce39d5ce3f4ed0f744000d903c4e7162182"} Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.495469 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" event={"ID":"0815c1c2-7fdf-4609-8de3-34aa56424c8a","Type":"ContainerStarted","Data":"f11b58f379a26c36d94116d6d69660211ada5b5611151e56a52a96ea200a00b6"} Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.496825 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.496881 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.496956 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc 
kubenswrapper[4740]: I0105 14:13:29.496986 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.497099 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.497222 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.497366 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.497457 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0142779e-9b14-4987-927f-6abae3becbdb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0142779e-9b14-4987-927f-6abae3becbdb\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.497633 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.497740 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.497811 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mb9w\" (UniqueName: \"kubernetes.io/projected/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-kube-api-access-9mb9w\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.575770 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.599691 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.599771 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.599795 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.599836 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0142779e-9b14-4987-927f-6abae3becbdb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0142779e-9b14-4987-927f-6abae3becbdb\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.599901 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.599946 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.599979 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mb9w\" (UniqueName: \"kubernetes.io/projected/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-kube-api-access-9mb9w\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.600027 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.600044 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.600101 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-erlang-cookie-secret\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.600127 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.601756 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.602014 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.606915 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.608323 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.609954 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.611849 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.612471 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.613928 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.614018 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0142779e-9b14-4987-927f-6abae3becbdb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0142779e-9b14-4987-927f-6abae3becbdb\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/db121f6067a3dd90232a4eb13ff6330ceaa43b4082caf1466f839106ba50b025/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.615506 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.622617 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.625646 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mb9w\" (UniqueName: \"kubernetes.io/projected/8a70451d-dcc2-4bce-81b1-e1f6291eac3b-kube-api-access-9mb9w\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.672297 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0142779e-9b14-4987-927f-6abae3becbdb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0142779e-9b14-4987-927f-6abae3becbdb\") pod \"rabbitmq-cell1-server-0\" (UID: \"8a70451d-dcc2-4bce-81b1-e1f6291eac3b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.681536 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:13:29 crc kubenswrapper[4740]: I0105 14:13:29.780625 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Jan 05 14:13:30 crc kubenswrapper[4740]: I0105 14:13:30.206299 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 05 14:13:30 crc kubenswrapper[4740]: I0105 14:13:30.526118 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8a70451d-dcc2-4bce-81b1-e1f6291eac3b","Type":"ContainerStarted","Data":"50ea2e935265b1a182021cb1964818cb24dffd4906cab9d9674d796240088173"} Jan 05 14:13:30 crc kubenswrapper[4740]: I0105 14:13:30.534949 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" event={"ID":"0815c1c2-7fdf-4609-8de3-34aa56424c8a","Type":"ContainerStarted","Data":"a5d294862620b82fcce14c9308366c990e9f636564406879547c88f2da8a6f6b"} Jan 05 14:13:30 crc kubenswrapper[4740]: I0105 14:13:30.535041 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:30 crc kubenswrapper[4740]: I0105 14:13:30.539017 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"8150ac8e-6303-4af1-8a21-6fb434df508b","Type":"ContainerStarted","Data":"f07f871470915ee55436b28cb14adb54f6a2a968d69d383d24bde2a7f0499829"} Jan 05 14:13:30 crc kubenswrapper[4740]: I0105 14:13:30.540477 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b922e17-0ca9-49cd-8af7-b78776b990bb","Type":"ContainerStarted","Data":"27fc7fd815303adc67ccaebffa0ef324236153e5b33162cfc181229f80a41978"} Jan 05 14:13:30 crc kubenswrapper[4740]: I0105 14:13:30.582715 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" podStartSLOduration=16.582696779 podStartE2EDuration="16.582696779s" podCreationTimestamp="2026-01-05 14:13:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:13:30.566854963 +0000 UTC m=+1459.873763552" watchObservedRunningTime="2026-01-05 14:13:30.582696779 +0000 UTC m=+1459.889605358" Jan 05 14:13:30 crc kubenswrapper[4740]: I0105 14:13:30.983672 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54c80dba-b90f-4288-a366-4ff77f76db22" path="/var/lib/kubelet/pods/54c80dba-b90f-4288-a366-4ff77f76db22/volumes" Jan 05 14:13:31 crc kubenswrapper[4740]: I0105 14:13:31.916255 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:13:31 crc kubenswrapper[4740]: I0105 14:13:31.916560 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:13:32 crc kubenswrapper[4740]: I0105 14:13:32.565178 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"8a70451d-dcc2-4bce-81b1-e1f6291eac3b","Type":"ContainerStarted","Data":"0d64b63284e2510df07c2d921c6c8af17cc88f482b20f5f893d00f8f23cdb286"} Jan 05 14:13:32 crc kubenswrapper[4740]: I0105 14:13:32.567980 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"8150ac8e-6303-4af1-8a21-6fb434df508b","Type":"ContainerStarted","Data":"eb50d8868c675d45cc40e9c933d0816bad74b61f08a8c05c069bd821ba89044f"} Jan 05 14:13:34 crc kubenswrapper[4740]: I0105 14:13:34.618451 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b922e17-0ca9-49cd-8af7-b78776b990bb","Type":"ContainerStarted","Data":"81e8ba9d6865673f9d753a4677be70994eece7af3846dfc661ceebd4675c4a19"} Jan 05 14:13:34 crc kubenswrapper[4740]: I0105 14:13:34.619291 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b922e17-0ca9-49cd-8af7-b78776b990bb","Type":"ContainerStarted","Data":"a2365de9898959ea761f80c2213583d780611415109a9687afa28703d73edcff"} Jan 05 14:13:34 crc kubenswrapper[4740]: I0105 14:13:34.772209 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:34 crc kubenswrapper[4740]: I0105 14:13:34.861027 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-hhmgg"] Jan 05 14:13:34 crc kubenswrapper[4740]: I0105 14:13:34.861366 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" podUID="0fc2c7ec-e277-42bf-b04e-030649d2671a" containerName="dnsmasq-dns" containerID="cri-o://83a1b5f2eea8caf6541119ef2e0f3fbedbf86576fdaaa5c1017cadeed1ff032b" gracePeriod=10 Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.012175 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-z745v"] Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.018728 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.068302 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-z745v"] Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.143647 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.143691 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-config\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.143715 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.143781 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.143830 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt85z\" (UniqueName: \"kubernetes.io/projected/43c57365-103e-453b-9fd9-685adcc47850-kube-api-access-zt85z\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.143878 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.143918 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.246756 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 
14:13:35.246998 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-config\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.247022 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.247106 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.247160 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt85z\" (UniqueName: \"kubernetes.io/projected/43c57365-103e-453b-9fd9-685adcc47850-kube-api-access-zt85z\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.247213 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.247253 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.248606 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.249091 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.249636 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.249678 4740 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-config\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.252504 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.253289 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43c57365-103e-453b-9fd9-685adcc47850-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.271254 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt85z\" (UniqueName: \"kubernetes.io/projected/43c57365-103e-453b-9fd9-685adcc47850-kube-api-access-zt85z\") pod \"dnsmasq-dns-5d75f767dc-z745v\" (UID: \"43c57365-103e-453b-9fd9-685adcc47850\") " pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.473633 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.586573 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.640163 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b922e17-0ca9-49cd-8af7-b78776b990bb","Type":"ContainerStarted","Data":"8ed1086215839d6b0088e28c75eff2396e0d0aefa08484577c54423a5defcc6e"} Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.642370 4740 generic.go:334] "Generic (PLEG): container finished" podID="0fc2c7ec-e277-42bf-b04e-030649d2671a" containerID="83a1b5f2eea8caf6541119ef2e0f3fbedbf86576fdaaa5c1017cadeed1ff032b" exitCode=0 Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.642406 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" event={"ID":"0fc2c7ec-e277-42bf-b04e-030649d2671a","Type":"ContainerDied","Data":"83a1b5f2eea8caf6541119ef2e0f3fbedbf86576fdaaa5c1017cadeed1ff032b"} Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.642428 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" event={"ID":"0fc2c7ec-e277-42bf-b04e-030649d2671a","Type":"ContainerDied","Data":"764bd0d6adc9d599d8e737f9476e0068155bc8d5ed155c13f8398e765d0c400c"} Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.642449 4740 scope.go:117] "RemoveContainer" containerID="83a1b5f2eea8caf6541119ef2e0f3fbedbf86576fdaaa5c1017cadeed1ff032b" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.642665 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-hhmgg" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.658111 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dncbf\" (UniqueName: \"kubernetes.io/projected/0fc2c7ec-e277-42bf-b04e-030649d2671a-kube-api-access-dncbf\") pod \"0fc2c7ec-e277-42bf-b04e-030649d2671a\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.658165 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-dns-svc\") pod \"0fc2c7ec-e277-42bf-b04e-030649d2671a\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.658258 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-ovsdbserver-nb\") pod \"0fc2c7ec-e277-42bf-b04e-030649d2671a\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.658324 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-ovsdbserver-sb\") pod \"0fc2c7ec-e277-42bf-b04e-030649d2671a\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.658429 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-config\") pod \"0fc2c7ec-e277-42bf-b04e-030649d2671a\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.658454 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-dns-swift-storage-0\") pod \"0fc2c7ec-e277-42bf-b04e-030649d2671a\" (UID: \"0fc2c7ec-e277-42bf-b04e-030649d2671a\") " Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.668387 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fc2c7ec-e277-42bf-b04e-030649d2671a-kube-api-access-dncbf" (OuterVolumeSpecName: "kube-api-access-dncbf") pod "0fc2c7ec-e277-42bf-b04e-030649d2671a" (UID: "0fc2c7ec-e277-42bf-b04e-030649d2671a"). InnerVolumeSpecName "kube-api-access-dncbf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.680938 4740 scope.go:117] "RemoveContainer" containerID="10019f97de730bbfea6f14f4e6ca558730938972a7efe8e0786de93ab7d4a586" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.744939 4740 scope.go:117] "RemoveContainer" containerID="83a1b5f2eea8caf6541119ef2e0f3fbedbf86576fdaaa5c1017cadeed1ff032b" Jan 05 14:13:35 crc kubenswrapper[4740]: E0105 14:13:35.745503 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83a1b5f2eea8caf6541119ef2e0f3fbedbf86576fdaaa5c1017cadeed1ff032b\": container with ID starting with 83a1b5f2eea8caf6541119ef2e0f3fbedbf86576fdaaa5c1017cadeed1ff032b not found: ID does not exist" containerID="83a1b5f2eea8caf6541119ef2e0f3fbedbf86576fdaaa5c1017cadeed1ff032b" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.745527 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83a1b5f2eea8caf6541119ef2e0f3fbedbf86576fdaaa5c1017cadeed1ff032b"} err="failed to get container status \"83a1b5f2eea8caf6541119ef2e0f3fbedbf86576fdaaa5c1017cadeed1ff032b\": rpc error: code = NotFound desc = could not find container \"83a1b5f2eea8caf6541119ef2e0f3fbedbf86576fdaaa5c1017cadeed1ff032b\": container with ID starting with 83a1b5f2eea8caf6541119ef2e0f3fbedbf86576fdaaa5c1017cadeed1ff032b not found: ID does not exist" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.745550 4740 scope.go:117] "RemoveContainer" containerID="10019f97de730bbfea6f14f4e6ca558730938972a7efe8e0786de93ab7d4a586" Jan 05 14:13:35 crc kubenswrapper[4740]: E0105 14:13:35.745766 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10019f97de730bbfea6f14f4e6ca558730938972a7efe8e0786de93ab7d4a586\": container with ID starting with 10019f97de730bbfea6f14f4e6ca558730938972a7efe8e0786de93ab7d4a586 not found: ID does not exist" containerID="10019f97de730bbfea6f14f4e6ca558730938972a7efe8e0786de93ab7d4a586" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.745784 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10019f97de730bbfea6f14f4e6ca558730938972a7efe8e0786de93ab7d4a586"} err="failed to get container status \"10019f97de730bbfea6f14f4e6ca558730938972a7efe8e0786de93ab7d4a586\": rpc error: code = NotFound desc = could not find container \"10019f97de730bbfea6f14f4e6ca558730938972a7efe8e0786de93ab7d4a586\": container with ID starting with 10019f97de730bbfea6f14f4e6ca558730938972a7efe8e0786de93ab7d4a586 not found: ID does not exist" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.751655 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0fc2c7ec-e277-42bf-b04e-030649d2671a" (UID: "0fc2c7ec-e277-42bf-b04e-030649d2671a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.760917 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.760942 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dncbf\" (UniqueName: \"kubernetes.io/projected/0fc2c7ec-e277-42bf-b04e-030649d2671a-kube-api-access-dncbf\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.774274 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0fc2c7ec-e277-42bf-b04e-030649d2671a" (UID: "0fc2c7ec-e277-42bf-b04e-030649d2671a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.813287 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0fc2c7ec-e277-42bf-b04e-030649d2671a" (UID: "0fc2c7ec-e277-42bf-b04e-030649d2671a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.838420 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0fc2c7ec-e277-42bf-b04e-030649d2671a" (UID: "0fc2c7ec-e277-42bf-b04e-030649d2671a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.860680 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-config" (OuterVolumeSpecName: "config") pod "0fc2c7ec-e277-42bf-b04e-030649d2671a" (UID: "0fc2c7ec-e277-42bf-b04e-030649d2671a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.863028 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.863074 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.863087 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.863099 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0fc2c7ec-e277-42bf-b04e-030649d2671a-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.983845 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-hhmgg"] Jan 05 14:13:35 crc kubenswrapper[4740]: I0105 14:13:35.999361 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-hhmgg"] Jan 05 14:13:36 crc kubenswrapper[4740]: I0105 14:13:36.044608 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-z745v"] Jan 05 14:13:36 crc kubenswrapper[4740]: W0105 14:13:36.051408 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43c57365_103e_453b_9fd9_685adcc47850.slice/crio-613d336f7b1d7d539a08cbd7f92f191843896a6700527a85b281e7d32f080e03 WatchSource:0}: Error finding container 613d336f7b1d7d539a08cbd7f92f191843896a6700527a85b281e7d32f080e03: Status 404 returned error can't find the container with id 613d336f7b1d7d539a08cbd7f92f191843896a6700527a85b281e7d32f080e03 Jan 05 14:13:36 crc kubenswrapper[4740]: I0105 14:13:36.689654 4740 generic.go:334] "Generic (PLEG): container finished" podID="43c57365-103e-453b-9fd9-685adcc47850" containerID="547eabb52635067fbb550b6a645e627e9207316532ebabc24debe5f368a409be" exitCode=0 Jan 05 14:13:36 crc kubenswrapper[4740]: I0105 14:13:36.689838 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-z745v" event={"ID":"43c57365-103e-453b-9fd9-685adcc47850","Type":"ContainerDied","Data":"547eabb52635067fbb550b6a645e627e9207316532ebabc24debe5f368a409be"} Jan 05 14:13:36 crc kubenswrapper[4740]: I0105 14:13:36.689946 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-z745v" event={"ID":"43c57365-103e-453b-9fd9-685adcc47850","Type":"ContainerStarted","Data":"613d336f7b1d7d539a08cbd7f92f191843896a6700527a85b281e7d32f080e03"} Jan 05 14:13:36 crc kubenswrapper[4740]: I0105 14:13:36.982880 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fc2c7ec-e277-42bf-b04e-030649d2671a" path="/var/lib/kubelet/pods/0fc2c7ec-e277-42bf-b04e-030649d2671a/volumes" Jan 05 14:13:37 crc kubenswrapper[4740]: I0105 14:13:37.705382 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-z745v" 
event={"ID":"43c57365-103e-453b-9fd9-685adcc47850","Type":"ContainerStarted","Data":"826b1dfdbd0e673da9e4ed1c53a66ddc98f302c2123485846c720f0790301481"} Jan 05 14:13:37 crc kubenswrapper[4740]: I0105 14:13:37.706964 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:37 crc kubenswrapper[4740]: I0105 14:13:37.709599 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b922e17-0ca9-49cd-8af7-b78776b990bb","Type":"ContainerStarted","Data":"090c30f37e31df304707c354b9dca83fce087251a51bf0a249087c2a42a0e404"} Jan 05 14:13:37 crc kubenswrapper[4740]: I0105 14:13:37.710701 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 05 14:13:37 crc kubenswrapper[4740]: I0105 14:13:37.731155 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d75f767dc-z745v" podStartSLOduration=3.731135947 podStartE2EDuration="3.731135947s" podCreationTimestamp="2026-01-05 14:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:13:37.727279983 +0000 UTC m=+1467.034188562" watchObservedRunningTime="2026-01-05 14:13:37.731135947 +0000 UTC m=+1467.038044536" Jan 05 14:13:37 crc kubenswrapper[4740]: I0105 14:13:37.765596 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.216731737 podStartE2EDuration="9.765567213s" podCreationTimestamp="2026-01-05 14:13:28 +0000 UTC" firstStartedPulling="2026-01-05 14:13:29.569269413 +0000 UTC m=+1458.876177992" lastFinishedPulling="2026-01-05 14:13:37.118104889 +0000 UTC m=+1466.425013468" observedRunningTime="2026-01-05 14:13:37.745826012 +0000 UTC m=+1467.052734601" watchObservedRunningTime="2026-01-05 14:13:37.765567213 +0000 UTC m=+1467.072475792" Jan 05 14:13:43 crc kubenswrapper[4740]: I0105 14:13:43.789005 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-snmgh" event={"ID":"04ef5055-0453-443e-b7cc-8fd93041e899","Type":"ContainerStarted","Data":"43116dd67a5b84441d88474b594ce4b48a671c0a95f4650be9f6aab637a5b688"} Jan 05 14:13:43 crc kubenswrapper[4740]: I0105 14:13:43.811753 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-snmgh" podStartSLOduration=2.41148814 podStartE2EDuration="46.811735414s" podCreationTimestamp="2026-01-05 14:12:57 +0000 UTC" firstStartedPulling="2026-01-05 14:12:58.772041152 +0000 UTC m=+1428.078949731" lastFinishedPulling="2026-01-05 14:13:43.172288426 +0000 UTC m=+1472.479197005" observedRunningTime="2026-01-05 14:13:43.802700281 +0000 UTC m=+1473.109608900" watchObservedRunningTime="2026-01-05 14:13:43.811735414 +0000 UTC m=+1473.118644003" Jan 05 14:13:45 crc kubenswrapper[4740]: I0105 14:13:45.475256 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d75f767dc-z745v" Jan 05 14:13:45 crc kubenswrapper[4740]: I0105 14:13:45.567454 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-2ff28"] Jan 05 14:13:45 crc kubenswrapper[4740]: I0105 14:13:45.567734 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" podUID="0815c1c2-7fdf-4609-8de3-34aa56424c8a" containerName="dnsmasq-dns" 
containerID="cri-o://a5d294862620b82fcce14c9308366c990e9f636564406879547c88f2da8a6f6b" gracePeriod=10 Jan 05 14:13:45 crc kubenswrapper[4740]: I0105 14:13:45.817044 4740 generic.go:334] "Generic (PLEG): container finished" podID="0815c1c2-7fdf-4609-8de3-34aa56424c8a" containerID="a5d294862620b82fcce14c9308366c990e9f636564406879547c88f2da8a6f6b" exitCode=0 Jan 05 14:13:45 crc kubenswrapper[4740]: I0105 14:13:45.817109 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" event={"ID":"0815c1c2-7fdf-4609-8de3-34aa56424c8a","Type":"ContainerDied","Data":"a5d294862620b82fcce14c9308366c990e9f636564406879547c88f2da8a6f6b"} Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.313864 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.336750 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvn5j\" (UniqueName: \"kubernetes.io/projected/0815c1c2-7fdf-4609-8de3-34aa56424c8a-kube-api-access-jvn5j\") pod \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.336979 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-dns-swift-storage-0\") pod \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.337030 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-openstack-edpm-ipam\") pod \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.337123 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-dns-svc\") pod \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.337154 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-ovsdbserver-nb\") pod \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.337724 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-ovsdbserver-sb\") pod \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.337836 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-config\") pod \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\" (UID: \"0815c1c2-7fdf-4609-8de3-34aa56424c8a\") " Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.371507 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0815c1c2-7fdf-4609-8de3-34aa56424c8a-kube-api-access-jvn5j" (OuterVolumeSpecName: "kube-api-access-jvn5j") pod "0815c1c2-7fdf-4609-8de3-34aa56424c8a" (UID: "0815c1c2-7fdf-4609-8de3-34aa56424c8a"). InnerVolumeSpecName "kube-api-access-jvn5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.433027 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0815c1c2-7fdf-4609-8de3-34aa56424c8a" (UID: "0815c1c2-7fdf-4609-8de3-34aa56424c8a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.443177 4740 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.443205 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvn5j\" (UniqueName: \"kubernetes.io/projected/0815c1c2-7fdf-4609-8de3-34aa56424c8a-kube-api-access-jvn5j\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.440289 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-config" (OuterVolumeSpecName: "config") pod "0815c1c2-7fdf-4609-8de3-34aa56424c8a" (UID: "0815c1c2-7fdf-4609-8de3-34aa56424c8a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.457909 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0815c1c2-7fdf-4609-8de3-34aa56424c8a" (UID: "0815c1c2-7fdf-4609-8de3-34aa56424c8a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.494613 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "0815c1c2-7fdf-4609-8de3-34aa56424c8a" (UID: "0815c1c2-7fdf-4609-8de3-34aa56424c8a"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.527013 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0815c1c2-7fdf-4609-8de3-34aa56424c8a" (UID: "0815c1c2-7fdf-4609-8de3-34aa56424c8a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.527980 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0815c1c2-7fdf-4609-8de3-34aa56424c8a" (UID: "0815c1c2-7fdf-4609-8de3-34aa56424c8a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.545277 4740 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.545317 4740 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.545327 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.545337 4740 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.545347 4740 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0815c1c2-7fdf-4609-8de3-34aa56424c8a-config\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.829332 4740 generic.go:334] "Generic (PLEG): container finished" podID="04ef5055-0453-443e-b7cc-8fd93041e899" containerID="43116dd67a5b84441d88474b594ce4b48a671c0a95f4650be9f6aab637a5b688" exitCode=0 Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.829410 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-snmgh" event={"ID":"04ef5055-0453-443e-b7cc-8fd93041e899","Type":"ContainerDied","Data":"43116dd67a5b84441d88474b594ce4b48a671c0a95f4650be9f6aab637a5b688"} Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.832880 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" event={"ID":"0815c1c2-7fdf-4609-8de3-34aa56424c8a","Type":"ContainerDied","Data":"f11b58f379a26c36d94116d6d69660211ada5b5611151e56a52a96ea200a00b6"} Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.832923 4740 scope.go:117] "RemoveContainer" containerID="a5d294862620b82fcce14c9308366c990e9f636564406879547c88f2da8a6f6b" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.832999 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-2ff28" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.894160 4740 scope.go:117] "RemoveContainer" containerID="0e338a3ec32f130f2582ee5486c63ce39d5ce3f4ed0f744000d903c4e7162182" Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.897584 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-2ff28"] Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.911693 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-2ff28"] Jan 05 14:13:46 crc kubenswrapper[4740]: I0105 14:13:46.982989 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0815c1c2-7fdf-4609-8de3-34aa56424c8a" path="/var/lib/kubelet/pods/0815c1c2-7fdf-4609-8de3-34aa56424c8a/volumes" Jan 05 14:13:48 crc kubenswrapper[4740]: I0105 14:13:48.404124 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-snmgh" Jan 05 14:13:48 crc kubenswrapper[4740]: I0105 14:13:48.599617 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-896mw\" (UniqueName: \"kubernetes.io/projected/04ef5055-0453-443e-b7cc-8fd93041e899-kube-api-access-896mw\") pod \"04ef5055-0453-443e-b7cc-8fd93041e899\" (UID: \"04ef5055-0453-443e-b7cc-8fd93041e899\") " Jan 05 14:13:48 crc kubenswrapper[4740]: I0105 14:13:48.599737 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ef5055-0453-443e-b7cc-8fd93041e899-combined-ca-bundle\") pod \"04ef5055-0453-443e-b7cc-8fd93041e899\" (UID: \"04ef5055-0453-443e-b7cc-8fd93041e899\") " Jan 05 14:13:48 crc kubenswrapper[4740]: I0105 14:13:48.599937 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ef5055-0453-443e-b7cc-8fd93041e899-config-data\") pod \"04ef5055-0453-443e-b7cc-8fd93041e899\" (UID: \"04ef5055-0453-443e-b7cc-8fd93041e899\") " Jan 05 14:13:48 crc kubenswrapper[4740]: I0105 14:13:48.608429 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ef5055-0453-443e-b7cc-8fd93041e899-kube-api-access-896mw" (OuterVolumeSpecName: "kube-api-access-896mw") pod "04ef5055-0453-443e-b7cc-8fd93041e899" (UID: "04ef5055-0453-443e-b7cc-8fd93041e899"). InnerVolumeSpecName "kube-api-access-896mw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:13:48 crc kubenswrapper[4740]: I0105 14:13:48.646949 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ef5055-0453-443e-b7cc-8fd93041e899-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04ef5055-0453-443e-b7cc-8fd93041e899" (UID: "04ef5055-0453-443e-b7cc-8fd93041e899"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:13:48 crc kubenswrapper[4740]: I0105 14:13:48.697928 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ef5055-0453-443e-b7cc-8fd93041e899-config-data" (OuterVolumeSpecName: "config-data") pod "04ef5055-0453-443e-b7cc-8fd93041e899" (UID: "04ef5055-0453-443e-b7cc-8fd93041e899"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:13:48 crc kubenswrapper[4740]: I0105 14:13:48.707229 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ef5055-0453-443e-b7cc-8fd93041e899-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:48 crc kubenswrapper[4740]: I0105 14:13:48.707275 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-896mw\" (UniqueName: \"kubernetes.io/projected/04ef5055-0453-443e-b7cc-8fd93041e899-kube-api-access-896mw\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:48 crc kubenswrapper[4740]: I0105 14:13:48.707292 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ef5055-0453-443e-b7cc-8fd93041e899-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:13:48 crc kubenswrapper[4740]: I0105 14:13:48.873173 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-snmgh" event={"ID":"04ef5055-0453-443e-b7cc-8fd93041e899","Type":"ContainerDied","Data":"af84f7bdb27e754daa46359e0b647d0080b538dff4c95d437ac72bddfb2187f1"} Jan 05 14:13:48 crc kubenswrapper[4740]: I0105 14:13:48.873527 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af84f7bdb27e754daa46359e0b647d0080b538dff4c95d437ac72bddfb2187f1" Jan 05 14:13:48 crc kubenswrapper[4740]: I0105 14:13:48.873238 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-snmgh" Jan 05 14:13:49 crc kubenswrapper[4740]: I0105 14:13:49.848501 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6ff48d7c64-5zmkg"] Jan 05 14:13:49 crc kubenswrapper[4740]: E0105 14:13:49.851207 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc2c7ec-e277-42bf-b04e-030649d2671a" containerName="init" Jan 05 14:13:49 crc kubenswrapper[4740]: I0105 14:13:49.851242 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fc2c7ec-e277-42bf-b04e-030649d2671a" containerName="init" Jan 05 14:13:49 crc kubenswrapper[4740]: E0105 14:13:49.851264 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc2c7ec-e277-42bf-b04e-030649d2671a" containerName="dnsmasq-dns" Jan 05 14:13:49 crc kubenswrapper[4740]: I0105 14:13:49.851272 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fc2c7ec-e277-42bf-b04e-030649d2671a" containerName="dnsmasq-dns" Jan 05 14:13:49 crc kubenswrapper[4740]: E0105 14:13:49.851312 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ef5055-0453-443e-b7cc-8fd93041e899" containerName="heat-db-sync" Jan 05 14:13:49 crc kubenswrapper[4740]: I0105 14:13:49.851323 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ef5055-0453-443e-b7cc-8fd93041e899" containerName="heat-db-sync" Jan 05 14:13:49 crc kubenswrapper[4740]: E0105 14:13:49.851362 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0815c1c2-7fdf-4609-8de3-34aa56424c8a" containerName="init" Jan 05 14:13:49 crc kubenswrapper[4740]: I0105 14:13:49.851370 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0815c1c2-7fdf-4609-8de3-34aa56424c8a" containerName="init" Jan 05 14:13:49 crc kubenswrapper[4740]: E0105 14:13:49.851385 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0815c1c2-7fdf-4609-8de3-34aa56424c8a" containerName="dnsmasq-dns" Jan 05 14:13:49 crc kubenswrapper[4740]: I0105 14:13:49.851391 4740 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="0815c1c2-7fdf-4609-8de3-34aa56424c8a" containerName="dnsmasq-dns" Jan 05 14:13:49 crc kubenswrapper[4740]: I0105 14:13:49.851756 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ef5055-0453-443e-b7cc-8fd93041e899" containerName="heat-db-sync" Jan 05 14:13:49 crc kubenswrapper[4740]: I0105 14:13:49.851797 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0815c1c2-7fdf-4609-8de3-34aa56424c8a" containerName="dnsmasq-dns" Jan 05 14:13:49 crc kubenswrapper[4740]: I0105 14:13:49.851825 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fc2c7ec-e277-42bf-b04e-030649d2671a" containerName="dnsmasq-dns" Jan 05 14:13:49 crc kubenswrapper[4740]: I0105 14:13:49.857013 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6ff48d7c64-5zmkg" Jan 05 14:13:49 crc kubenswrapper[4740]: I0105 14:13:49.877375 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6ff48d7c64-5zmkg"] Jan 05 14:13:49 crc kubenswrapper[4740]: I0105 14:13:49.955133 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5996fc5cd9-vd2l5"] Jan 05 14:13:49 crc kubenswrapper[4740]: I0105 14:13:49.957195 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:49 crc kubenswrapper[4740]: I0105 14:13:49.970517 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5cd669fcc8-pkmkd"] Jan 05 14:13:49 crc kubenswrapper[4740]: I0105 14:13:49.972512 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:49 crc kubenswrapper[4740]: I0105 14:13:49.982848 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5996fc5cd9-vd2l5"] Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.046560 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdmvt\" (UniqueName: \"kubernetes.io/projected/efa1a5de-d25c-4be0-b921-2cd12446d1dc-kube-api-access-qdmvt\") pod \"heat-engine-6ff48d7c64-5zmkg\" (UID: \"efa1a5de-d25c-4be0-b921-2cd12446d1dc\") " pod="openstack/heat-engine-6ff48d7c64-5zmkg" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.046634 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa1a5de-d25c-4be0-b921-2cd12446d1dc-combined-ca-bundle\") pod \"heat-engine-6ff48d7c64-5zmkg\" (UID: \"efa1a5de-d25c-4be0-b921-2cd12446d1dc\") " pod="openstack/heat-engine-6ff48d7c64-5zmkg" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.048733 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa1a5de-d25c-4be0-b921-2cd12446d1dc-config-data\") pod \"heat-engine-6ff48d7c64-5zmkg\" (UID: \"efa1a5de-d25c-4be0-b921-2cd12446d1dc\") " pod="openstack/heat-engine-6ff48d7c64-5zmkg" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.048798 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efa1a5de-d25c-4be0-b921-2cd12446d1dc-config-data-custom\") pod \"heat-engine-6ff48d7c64-5zmkg\" (UID: \"efa1a5de-d25c-4be0-b921-2cd12446d1dc\") " pod="openstack/heat-engine-6ff48d7c64-5zmkg" Jan 05 14:13:50 crc 
kubenswrapper[4740]: I0105 14:13:50.088117 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5cd669fcc8-pkmkd"] Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.151520 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa1a5de-d25c-4be0-b921-2cd12446d1dc-combined-ca-bundle\") pod \"heat-engine-6ff48d7c64-5zmkg\" (UID: \"efa1a5de-d25c-4be0-b921-2cd12446d1dc\") " pod="openstack/heat-engine-6ff48d7c64-5zmkg" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.151586 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59nw9\" (UniqueName: \"kubernetes.io/projected/e19ab652-e210-4428-84aa-8f04d156a4fb-kube-api-access-59nw9\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.151977 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgnkl\" (UniqueName: \"kubernetes.io/projected/5598f372-3948-4c7b-8f77-f8305802d248-kube-api-access-dgnkl\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.152044 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5598f372-3948-4c7b-8f77-f8305802d248-combined-ca-bundle\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.152113 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5598f372-3948-4c7b-8f77-f8305802d248-public-tls-certs\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.152138 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5598f372-3948-4c7b-8f77-f8305802d248-config-data\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.152200 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa1a5de-d25c-4be0-b921-2cd12446d1dc-config-data\") pod \"heat-engine-6ff48d7c64-5zmkg\" (UID: \"efa1a5de-d25c-4be0-b921-2cd12446d1dc\") " pod="openstack/heat-engine-6ff48d7c64-5zmkg" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.152223 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efa1a5de-d25c-4be0-b921-2cd12446d1dc-config-data-custom\") pod \"heat-engine-6ff48d7c64-5zmkg\" (UID: \"efa1a5de-d25c-4be0-b921-2cd12446d1dc\") " pod="openstack/heat-engine-6ff48d7c64-5zmkg" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.152256 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5598f372-3948-4c7b-8f77-f8305802d248-internal-tls-certs\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.152316 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e19ab652-e210-4428-84aa-8f04d156a4fb-internal-tls-certs\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.152723 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e19ab652-e210-4428-84aa-8f04d156a4fb-public-tls-certs\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.152767 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e19ab652-e210-4428-84aa-8f04d156a4fb-config-data\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.152796 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdmvt\" (UniqueName: \"kubernetes.io/projected/efa1a5de-d25c-4be0-b921-2cd12446d1dc-kube-api-access-qdmvt\") pod \"heat-engine-6ff48d7c64-5zmkg\" (UID: \"efa1a5de-d25c-4be0-b921-2cd12446d1dc\") " pod="openstack/heat-engine-6ff48d7c64-5zmkg" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.152829 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e19ab652-e210-4428-84aa-8f04d156a4fb-config-data-custom\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.152847 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e19ab652-e210-4428-84aa-8f04d156a4fb-combined-ca-bundle\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.152862 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5598f372-3948-4c7b-8f77-f8305802d248-config-data-custom\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.157728 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa1a5de-d25c-4be0-b921-2cd12446d1dc-config-data\") pod \"heat-engine-6ff48d7c64-5zmkg\" (UID: \"efa1a5de-d25c-4be0-b921-2cd12446d1dc\") " pod="openstack/heat-engine-6ff48d7c64-5zmkg" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.173821 4740 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efa1a5de-d25c-4be0-b921-2cd12446d1dc-config-data-custom\") pod \"heat-engine-6ff48d7c64-5zmkg\" (UID: \"efa1a5de-d25c-4be0-b921-2cd12446d1dc\") " pod="openstack/heat-engine-6ff48d7c64-5zmkg" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.174395 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa1a5de-d25c-4be0-b921-2cd12446d1dc-combined-ca-bundle\") pod \"heat-engine-6ff48d7c64-5zmkg\" (UID: \"efa1a5de-d25c-4be0-b921-2cd12446d1dc\") " pod="openstack/heat-engine-6ff48d7c64-5zmkg" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.175845 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdmvt\" (UniqueName: \"kubernetes.io/projected/efa1a5de-d25c-4be0-b921-2cd12446d1dc-kube-api-access-qdmvt\") pod \"heat-engine-6ff48d7c64-5zmkg\" (UID: \"efa1a5de-d25c-4be0-b921-2cd12446d1dc\") " pod="openstack/heat-engine-6ff48d7c64-5zmkg" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.180767 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6ff48d7c64-5zmkg" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.255033 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5598f372-3948-4c7b-8f77-f8305802d248-internal-tls-certs\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.255314 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e19ab652-e210-4428-84aa-8f04d156a4fb-internal-tls-certs\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.255354 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e19ab652-e210-4428-84aa-8f04d156a4fb-public-tls-certs\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.255374 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e19ab652-e210-4428-84aa-8f04d156a4fb-config-data\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.255418 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e19ab652-e210-4428-84aa-8f04d156a4fb-config-data-custom\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.255443 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e19ab652-e210-4428-84aa-8f04d156a4fb-combined-ca-bundle\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " 
pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.255464 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5598f372-3948-4c7b-8f77-f8305802d248-config-data-custom\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.255509 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59nw9\" (UniqueName: \"kubernetes.io/projected/e19ab652-e210-4428-84aa-8f04d156a4fb-kube-api-access-59nw9\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.255545 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgnkl\" (UniqueName: \"kubernetes.io/projected/5598f372-3948-4c7b-8f77-f8305802d248-kube-api-access-dgnkl\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.255574 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5598f372-3948-4c7b-8f77-f8305802d248-combined-ca-bundle\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.255603 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5598f372-3948-4c7b-8f77-f8305802d248-public-tls-certs\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.255624 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5598f372-3948-4c7b-8f77-f8305802d248-config-data\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.263529 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e19ab652-e210-4428-84aa-8f04d156a4fb-combined-ca-bundle\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.267648 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e19ab652-e210-4428-84aa-8f04d156a4fb-public-tls-certs\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.269347 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e19ab652-e210-4428-84aa-8f04d156a4fb-config-data\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc 
kubenswrapper[4740]: I0105 14:13:50.279298 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e19ab652-e210-4428-84aa-8f04d156a4fb-internal-tls-certs\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.281811 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5598f372-3948-4c7b-8f77-f8305802d248-config-data\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.282382 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5598f372-3948-4c7b-8f77-f8305802d248-internal-tls-certs\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.289787 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5598f372-3948-4c7b-8f77-f8305802d248-public-tls-certs\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.290436 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5598f372-3948-4c7b-8f77-f8305802d248-config-data-custom\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.290675 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e19ab652-e210-4428-84aa-8f04d156a4fb-config-data-custom\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.290839 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59nw9\" (UniqueName: \"kubernetes.io/projected/e19ab652-e210-4428-84aa-8f04d156a4fb-kube-api-access-59nw9\") pod \"heat-cfnapi-5cd669fcc8-pkmkd\" (UID: \"e19ab652-e210-4428-84aa-8f04d156a4fb\") " pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.301846 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5598f372-3948-4c7b-8f77-f8305802d248-combined-ca-bundle\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.306592 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgnkl\" (UniqueName: \"kubernetes.io/projected/5598f372-3948-4c7b-8f77-f8305802d248-kube-api-access-dgnkl\") pod \"heat-api-5996fc5cd9-vd2l5\" (UID: \"5598f372-3948-4c7b-8f77-f8305802d248\") " pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.307284 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.573901 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:50 crc kubenswrapper[4740]: W0105 14:13:50.800545 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefa1a5de_d25c_4be0_b921_2cd12446d1dc.slice/crio-84b02d13c47b04078ab9523d879f6408fbadc1552839933fcd85d346cf7dff48 WatchSource:0}: Error finding container 84b02d13c47b04078ab9523d879f6408fbadc1552839933fcd85d346cf7dff48: Status 404 returned error can't find the container with id 84b02d13c47b04078ab9523d879f6408fbadc1552839933fcd85d346cf7dff48 Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.801409 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6ff48d7c64-5zmkg"] Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.915374 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6ff48d7c64-5zmkg" event={"ID":"efa1a5de-d25c-4be0-b921-2cd12446d1dc","Type":"ContainerStarted","Data":"84b02d13c47b04078ab9523d879f6408fbadc1552839933fcd85d346cf7dff48"} Jan 05 14:13:50 crc kubenswrapper[4740]: W0105 14:13:50.974339 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode19ab652_e210_4428_84aa_8f04d156a4fb.slice/crio-a6b7b9d673d0807c1304f255ab454c50ed0aea8d8348d218b5f35dbe9e87651d WatchSource:0}: Error finding container a6b7b9d673d0807c1304f255ab454c50ed0aea8d8348d218b5f35dbe9e87651d: Status 404 returned error can't find the container with id a6b7b9d673d0807c1304f255ab454c50ed0aea8d8348d218b5f35dbe9e87651d Jan 05 14:13:50 crc kubenswrapper[4740]: I0105 14:13:50.988870 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5cd669fcc8-pkmkd"] Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.203858 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5996fc5cd9-vd2l5"] Jan 05 14:13:51 crc kubenswrapper[4740]: W0105 14:13:51.213730 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5598f372_3948_4c7b_8f77_f8305802d248.slice/crio-1368bd7ef6a26fb0f332767c69841099f845dcc84fd82a3d8df2285218004eb7 WatchSource:0}: Error finding container 1368bd7ef6a26fb0f332767c69841099f845dcc84fd82a3d8df2285218004eb7: Status 404 returned error can't find the container with id 1368bd7ef6a26fb0f332767c69841099f845dcc84fd82a3d8df2285218004eb7 Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.510579 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mh68f"] Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.513031 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mh68f" Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.545394 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mh68f"] Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.595338 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4czdh\" (UniqueName: \"kubernetes.io/projected/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-kube-api-access-4czdh\") pod \"community-operators-mh68f\" (UID: \"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731\") " pod="openshift-marketplace/community-operators-mh68f" Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.595431 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-utilities\") pod \"community-operators-mh68f\" (UID: \"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731\") " pod="openshift-marketplace/community-operators-mh68f" Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.595520 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-catalog-content\") pod \"community-operators-mh68f\" (UID: \"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731\") " pod="openshift-marketplace/community-operators-mh68f" Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.697884 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-utilities\") pod \"community-operators-mh68f\" (UID: \"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731\") " pod="openshift-marketplace/community-operators-mh68f" Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.698014 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-catalog-content\") pod \"community-operators-mh68f\" (UID: \"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731\") " pod="openshift-marketplace/community-operators-mh68f" Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.698270 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4czdh\" (UniqueName: \"kubernetes.io/projected/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-kube-api-access-4czdh\") pod \"community-operators-mh68f\" (UID: \"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731\") " pod="openshift-marketplace/community-operators-mh68f" Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.698394 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-utilities\") pod \"community-operators-mh68f\" (UID: \"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731\") " pod="openshift-marketplace/community-operators-mh68f" Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.698537 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-catalog-content\") pod \"community-operators-mh68f\" (UID: \"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731\") " pod="openshift-marketplace/community-operators-mh68f" Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.720465 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4czdh\" (UniqueName: \"kubernetes.io/projected/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-kube-api-access-4czdh\") pod \"community-operators-mh68f\" (UID: \"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731\") " pod="openshift-marketplace/community-operators-mh68f" Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.842480 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mh68f" Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.963535 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" event={"ID":"e19ab652-e210-4428-84aa-8f04d156a4fb","Type":"ContainerStarted","Data":"a6b7b9d673d0807c1304f255ab454c50ed0aea8d8348d218b5f35dbe9e87651d"} Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.968427 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5996fc5cd9-vd2l5" event={"ID":"5598f372-3948-4c7b-8f77-f8305802d248","Type":"ContainerStarted","Data":"1368bd7ef6a26fb0f332767c69841099f845dcc84fd82a3d8df2285218004eb7"} Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.974720 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6ff48d7c64-5zmkg" event={"ID":"efa1a5de-d25c-4be0-b921-2cd12446d1dc","Type":"ContainerStarted","Data":"9e6ef703a53ab7c24c33ff01c1100b4231b68f983cbce159eb759ab44bf429df"} Jan 05 14:13:51 crc kubenswrapper[4740]: I0105 14:13:51.976245 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6ff48d7c64-5zmkg" Jan 05 14:13:52 crc kubenswrapper[4740]: I0105 14:13:52.005572 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6ff48d7c64-5zmkg" podStartSLOduration=3.005556047 podStartE2EDuration="3.005556047s" podCreationTimestamp="2026-01-05 14:13:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:13:51.989293979 +0000 UTC m=+1481.296202558" watchObservedRunningTime="2026-01-05 14:13:52.005556047 +0000 UTC m=+1481.312464636" Jan 05 14:13:53 crc kubenswrapper[4740]: I0105 14:13:53.424292 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mh68f"] Jan 05 14:13:54 crc kubenswrapper[4740]: I0105 14:13:54.046393 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" event={"ID":"e19ab652-e210-4428-84aa-8f04d156a4fb","Type":"ContainerStarted","Data":"84361136e9eafcbb87a120c4f61f7ec9346d4e9f35e3512cf0daf871ecd4b0eb"} Jan 05 14:13:54 crc kubenswrapper[4740]: I0105 14:13:54.046992 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:13:54 crc kubenswrapper[4740]: I0105 14:13:54.050625 4740 generic.go:334] "Generic (PLEG): container finished" podID="7c0e6442-ff3d-4d6d-b54a-69ea6da1e731" containerID="5157bca55b6171d4c2d69378642d9f0bda9897da8a571730702c86d93823b1e3" exitCode=0 Jan 05 14:13:54 crc kubenswrapper[4740]: I0105 14:13:54.050915 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mh68f" event={"ID":"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731","Type":"ContainerDied","Data":"5157bca55b6171d4c2d69378642d9f0bda9897da8a571730702c86d93823b1e3"} Jan 05 14:13:54 crc kubenswrapper[4740]: I0105 14:13:54.050963 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-mh68f" event={"ID":"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731","Type":"ContainerStarted","Data":"c28bf85ebf634f07efa58a6bf38e5f0000c860831927adebc1e09326545a69ef"} Jan 05 14:13:54 crc kubenswrapper[4740]: I0105 14:13:54.054368 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5996fc5cd9-vd2l5" event={"ID":"5598f372-3948-4c7b-8f77-f8305802d248","Type":"ContainerStarted","Data":"8bd7839105ccc25044ca90a4ad3024402b3c77117c0fb2c3db6bed82a49d7ba6"} Jan 05 14:13:54 crc kubenswrapper[4740]: I0105 14:13:54.054578 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:13:54 crc kubenswrapper[4740]: I0105 14:13:54.093685 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" podStartSLOduration=3.276418602 podStartE2EDuration="5.093665607s" podCreationTimestamp="2026-01-05 14:13:49 +0000 UTC" firstStartedPulling="2026-01-05 14:13:50.976703106 +0000 UTC m=+1480.283611685" lastFinishedPulling="2026-01-05 14:13:52.793950111 +0000 UTC m=+1482.100858690" observedRunningTime="2026-01-05 14:13:54.071233024 +0000 UTC m=+1483.378141643" watchObservedRunningTime="2026-01-05 14:13:54.093665607 +0000 UTC m=+1483.400574186" Jan 05 14:13:54 crc kubenswrapper[4740]: I0105 14:13:54.140054 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5996fc5cd9-vd2l5" podStartSLOduration=3.552881926 podStartE2EDuration="5.140037363s" podCreationTimestamp="2026-01-05 14:13:49 +0000 UTC" firstStartedPulling="2026-01-05 14:13:51.216118695 +0000 UTC m=+1480.523027274" lastFinishedPulling="2026-01-05 14:13:52.803274132 +0000 UTC m=+1482.110182711" observedRunningTime="2026-01-05 14:13:54.127147957 +0000 UTC m=+1483.434056546" watchObservedRunningTime="2026-01-05 14:13:54.140037363 +0000 UTC m=+1483.446945942" Jan 05 14:13:55 crc kubenswrapper[4740]: I0105 14:13:55.067230 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mh68f" event={"ID":"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731","Type":"ContainerStarted","Data":"1abf241e5ab43fb35d5992dfaf9d5a59f2583f79c13154922cb87e9d5753eb60"} Jan 05 14:13:57 crc kubenswrapper[4740]: I0105 14:13:57.132665 4740 generic.go:334] "Generic (PLEG): container finished" podID="7c0e6442-ff3d-4d6d-b54a-69ea6da1e731" containerID="1abf241e5ab43fb35d5992dfaf9d5a59f2583f79c13154922cb87e9d5753eb60" exitCode=0 Jan 05 14:13:57 crc kubenswrapper[4740]: I0105 14:13:57.132760 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mh68f" event={"ID":"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731","Type":"ContainerDied","Data":"1abf241e5ab43fb35d5992dfaf9d5a59f2583f79c13154922cb87e9d5753eb60"} Jan 05 14:13:57 crc kubenswrapper[4740]: I0105 14:13:57.326862 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="dc9693bc-68c2-4ba7-bd23-95303bf357d4" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.1.13:3000/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 14:13:58 crc kubenswrapper[4740]: I0105 14:13:58.145657 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mh68f" 
event={"ID":"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731","Type":"ContainerStarted","Data":"a88fe9e49b6ac1ab3cc8f1ee049e536ff3daaa24490fdde9c767c7ac6fd56bd5"} Jan 05 14:13:58 crc kubenswrapper[4740]: I0105 14:13:58.185529 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mh68f" podStartSLOduration=3.530207587 podStartE2EDuration="7.185508524s" podCreationTimestamp="2026-01-05 14:13:51 +0000 UTC" firstStartedPulling="2026-01-05 14:13:54.053157927 +0000 UTC m=+1483.360066506" lastFinishedPulling="2026-01-05 14:13:57.708458864 +0000 UTC m=+1487.015367443" observedRunningTime="2026-01-05 14:13:58.181060105 +0000 UTC m=+1487.487968694" watchObservedRunningTime="2026-01-05 14:13:58.185508524 +0000 UTC m=+1487.492417103" Jan 05 14:13:58 crc kubenswrapper[4740]: I0105 14:13:58.988179 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.794764 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-96dx9"] Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.797731 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-96dx9" Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.814808 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-96dx9"] Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.842898 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mh68f" Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.842936 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mh68f" Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.892711 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/528a996a-b805-4638-a027-d52d8733d109-utilities\") pod \"redhat-marketplace-96dx9\" (UID: \"528a996a-b805-4638-a027-d52d8733d109\") " pod="openshift-marketplace/redhat-marketplace-96dx9" Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.893231 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28jls\" (UniqueName: \"kubernetes.io/projected/528a996a-b805-4638-a027-d52d8733d109-kube-api-access-28jls\") pod \"redhat-marketplace-96dx9\" (UID: \"528a996a-b805-4638-a027-d52d8733d109\") " pod="openshift-marketplace/redhat-marketplace-96dx9" Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.893361 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/528a996a-b805-4638-a027-d52d8733d109-catalog-content\") pod \"redhat-marketplace-96dx9\" (UID: \"528a996a-b805-4638-a027-d52d8733d109\") " pod="openshift-marketplace/redhat-marketplace-96dx9" Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.916368 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.916434 4740 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.916476 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.917427 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bcada73fec747c8bb22d39df02fd140c2ecd4b0b0dc04e0085ad7f13dab4ab07"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.917498 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://bcada73fec747c8bb22d39df02fd140c2ecd4b0b0dc04e0085ad7f13dab4ab07" gracePeriod=600 Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.970830 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-5cd669fcc8-pkmkd" Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.994977 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/528a996a-b805-4638-a027-d52d8733d109-catalog-content\") pod \"redhat-marketplace-96dx9\" (UID: \"528a996a-b805-4638-a027-d52d8733d109\") " pod="openshift-marketplace/redhat-marketplace-96dx9" Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.995127 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/528a996a-b805-4638-a027-d52d8733d109-utilities\") pod \"redhat-marketplace-96dx9\" (UID: \"528a996a-b805-4638-a027-d52d8733d109\") " pod="openshift-marketplace/redhat-marketplace-96dx9" Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.995276 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28jls\" (UniqueName: \"kubernetes.io/projected/528a996a-b805-4638-a027-d52d8733d109-kube-api-access-28jls\") pod \"redhat-marketplace-96dx9\" (UID: \"528a996a-b805-4638-a027-d52d8733d109\") " pod="openshift-marketplace/redhat-marketplace-96dx9" Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.995366 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/528a996a-b805-4638-a027-d52d8733d109-catalog-content\") pod \"redhat-marketplace-96dx9\" (UID: \"528a996a-b805-4638-a027-d52d8733d109\") " pod="openshift-marketplace/redhat-marketplace-96dx9" Jan 05 14:14:01 crc kubenswrapper[4740]: I0105 14:14:01.995608 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/528a996a-b805-4638-a027-d52d8733d109-utilities\") pod \"redhat-marketplace-96dx9\" (UID: \"528a996a-b805-4638-a027-d52d8733d109\") " pod="openshift-marketplace/redhat-marketplace-96dx9" Jan 05 14:14:02 crc kubenswrapper[4740]: I0105 14:14:02.020342 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28jls\" (UniqueName: \"kubernetes.io/projected/528a996a-b805-4638-a027-d52d8733d109-kube-api-access-28jls\") pod \"redhat-marketplace-96dx9\" (UID: \"528a996a-b805-4638-a027-d52d8733d109\") " pod="openshift-marketplace/redhat-marketplace-96dx9" Jan 05 14:14:02 crc kubenswrapper[4740]: I0105 14:14:02.052162 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-75bf4fb87-2g4f8"] Jan 05 14:14:02 crc kubenswrapper[4740]: I0105 14:14:02.052669 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" podUID="9307c633-b44c-4f4d-8414-9f28826c30dc" containerName="heat-cfnapi" containerID="cri-o://302a005d7d2596d5fa6e342ea5a9360993a4093c179ca42a5fcdbb899ba2fee5" gracePeriod=60 Jan 05 14:14:02 crc kubenswrapper[4740]: I0105 14:14:02.118530 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-96dx9" Jan 05 14:14:02 crc kubenswrapper[4740]: I0105 14:14:02.234355 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="bcada73fec747c8bb22d39df02fd140c2ecd4b0b0dc04e0085ad7f13dab4ab07" exitCode=0 Jan 05 14:14:02 crc kubenswrapper[4740]: I0105 14:14:02.234408 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"bcada73fec747c8bb22d39df02fd140c2ecd4b0b0dc04e0085ad7f13dab4ab07"} Jan 05 14:14:02 crc kubenswrapper[4740]: I0105 14:14:02.234442 4740 scope.go:117] "RemoveContainer" containerID="7164e8ec74a1f47d6179acf6f6c20f4c18a05f4cce20c982561062888c48c311" Jan 05 14:14:02 crc kubenswrapper[4740]: I0105 14:14:02.510483 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5996fc5cd9-vd2l5" Jan 05 14:14:02 crc kubenswrapper[4740]: I0105 14:14:02.588756 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-8669649c9d-h45rr"] Jan 05 14:14:02 crc kubenswrapper[4740]: I0105 14:14:02.589239 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-8669649c9d-h45rr" podUID="116ff769-c12d-42af-93a9-549001762acd" containerName="heat-api" containerID="cri-o://2fc4036255b3f20c57469e0a421b0befb18910d0ca48c27c57bdb0b94e427c6d" gracePeriod=60 Jan 05 14:14:02 crc kubenswrapper[4740]: I0105 14:14:02.740508 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-96dx9"] Jan 05 14:14:02 crc kubenswrapper[4740]: I0105 14:14:02.905346 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-mh68f" podUID="7c0e6442-ff3d-4d6d-b54a-69ea6da1e731" containerName="registry-server" probeResult="failure" output=< Jan 05 14:14:02 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 14:14:02 crc kubenswrapper[4740]: > Jan 05 14:14:03 crc kubenswrapper[4740]: I0105 14:14:03.266971 4740 generic.go:334] "Generic (PLEG): container finished" podID="528a996a-b805-4638-a027-d52d8733d109" containerID="cfd9d689a483a8848647ca99eb34c99f4682ad2f5efd05f6a59f94c08c64951b" exitCode=0 Jan 05 14:14:03 crc kubenswrapper[4740]: I0105 14:14:03.267274 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96dx9" 
event={"ID":"528a996a-b805-4638-a027-d52d8733d109","Type":"ContainerDied","Data":"cfd9d689a483a8848647ca99eb34c99f4682ad2f5efd05f6a59f94c08c64951b"} Jan 05 14:14:03 crc kubenswrapper[4740]: I0105 14:14:03.267303 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96dx9" event={"ID":"528a996a-b805-4638-a027-d52d8733d109","Type":"ContainerStarted","Data":"43fb26800aed6b188f6ec937b20150373d5bb55b6ec0040527fce3ff30417561"} Jan 05 14:14:03 crc kubenswrapper[4740]: I0105 14:14:03.272885 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5"} Jan 05 14:14:04 crc kubenswrapper[4740]: I0105 14:14:04.297541 4740 generic.go:334] "Generic (PLEG): container finished" podID="8150ac8e-6303-4af1-8a21-6fb434df508b" containerID="eb50d8868c675d45cc40e9c933d0816bad74b61f08a8c05c069bd821ba89044f" exitCode=0 Jan 05 14:14:04 crc kubenswrapper[4740]: I0105 14:14:04.297849 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"8150ac8e-6303-4af1-8a21-6fb434df508b","Type":"ContainerDied","Data":"eb50d8868c675d45cc40e9c933d0816bad74b61f08a8c05c069bd821ba89044f"} Jan 05 14:14:04 crc kubenswrapper[4740]: I0105 14:14:04.308212 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96dx9" event={"ID":"528a996a-b805-4638-a027-d52d8733d109","Type":"ContainerStarted","Data":"a83f6672ec1df258da11a1e585d7447a75a7b5d235fb12bb2aa437b896b67da8"} Jan 05 14:14:05 crc kubenswrapper[4740]: I0105 14:14:05.322012 4740 generic.go:334] "Generic (PLEG): container finished" podID="8a70451d-dcc2-4bce-81b1-e1f6291eac3b" containerID="0d64b63284e2510df07c2d921c6c8af17cc88f482b20f5f893d00f8f23cdb286" exitCode=0 Jan 05 14:14:05 crc kubenswrapper[4740]: I0105 14:14:05.322146 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8a70451d-dcc2-4bce-81b1-e1f6291eac3b","Type":"ContainerDied","Data":"0d64b63284e2510df07c2d921c6c8af17cc88f482b20f5f893d00f8f23cdb286"} Jan 05 14:14:05 crc kubenswrapper[4740]: I0105 14:14:05.330319 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"8150ac8e-6303-4af1-8a21-6fb434df508b","Type":"ContainerStarted","Data":"9f245c2b2d1c3ba16e9c0b42cc4dfe21a843c1f80594433c45128a49ac395645"} Jan 05 14:14:05 crc kubenswrapper[4740]: I0105 14:14:05.331150 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Jan 05 14:14:05 crc kubenswrapper[4740]: I0105 14:14:05.344056 4740 generic.go:334] "Generic (PLEG): container finished" podID="528a996a-b805-4638-a027-d52d8733d109" containerID="a83f6672ec1df258da11a1e585d7447a75a7b5d235fb12bb2aa437b896b67da8" exitCode=0 Jan 05 14:14:05 crc kubenswrapper[4740]: I0105 14:14:05.344130 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96dx9" event={"ID":"528a996a-b805-4638-a027-d52d8733d109","Type":"ContainerDied","Data":"a83f6672ec1df258da11a1e585d7447a75a7b5d235fb12bb2aa437b896b67da8"} Jan 05 14:14:05 crc kubenswrapper[4740]: I0105 14:14:05.397349 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=37.397319256 podStartE2EDuration="37.397319256s" 
podCreationTimestamp="2026-01-05 14:13:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:14:05.382011994 +0000 UTC m=+1494.688920573" watchObservedRunningTime="2026-01-05 14:14:05.397319256 +0000 UTC m=+1494.704227835" Jan 05 14:14:05 crc kubenswrapper[4740]: I0105 14:14:05.542422 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" podUID="9307c633-b44c-4f4d-8414-9f28826c30dc" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.230:8000/healthcheck\": read tcp 10.217.0.2:60032->10.217.0.230:8000: read: connection reset by peer" Jan 05 14:14:05 crc kubenswrapper[4740]: I0105 14:14:05.888379 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts"] Jan 05 14:14:05 crc kubenswrapper[4740]: I0105 14:14:05.892320 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" Jan 05 14:14:05 crc kubenswrapper[4740]: I0105 14:14:05.900653 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:14:05 crc kubenswrapper[4740]: I0105 14:14:05.901493 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:14:05 crc kubenswrapper[4740]: I0105 14:14:05.901887 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:14:05 crc kubenswrapper[4740]: I0105 14:14:05.902677 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:14:05 crc kubenswrapper[4740]: I0105 14:14:05.923133 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts"] Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.047675 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-flmts\" (UID: \"d20a172f-298d-47ac-a858-cd1600b65c4e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.047760 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-flmts\" (UID: \"d20a172f-298d-47ac-a858-cd1600b65c4e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.048199 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9jh8\" (UniqueName: \"kubernetes.io/projected/d20a172f-298d-47ac-a858-cd1600b65c4e-kube-api-access-p9jh8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-flmts\" (UID: \"d20a172f-298d-47ac-a858-cd1600b65c4e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.048806 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-flmts\" (UID: \"d20a172f-298d-47ac-a858-cd1600b65c4e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.085691 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-8669649c9d-h45rr" podUID="116ff769-c12d-42af-93a9-549001762acd" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.231:8004/healthcheck\": read tcp 10.217.0.2:48398->10.217.0.231:8004: read: connection reset by peer" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.151834 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9jh8\" (UniqueName: \"kubernetes.io/projected/d20a172f-298d-47ac-a858-cd1600b65c4e-kube-api-access-p9jh8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-flmts\" (UID: \"d20a172f-298d-47ac-a858-cd1600b65c4e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.152163 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-flmts\" (UID: \"d20a172f-298d-47ac-a858-cd1600b65c4e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.152255 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-flmts\" (UID: \"d20a172f-298d-47ac-a858-cd1600b65c4e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.152324 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-flmts\" (UID: \"d20a172f-298d-47ac-a858-cd1600b65c4e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.158651 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-flmts\" (UID: \"d20a172f-298d-47ac-a858-cd1600b65c4e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.160416 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-flmts\" (UID: \"d20a172f-298d-47ac-a858-cd1600b65c4e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.174554 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-flmts\" (UID: \"d20a172f-298d-47ac-a858-cd1600b65c4e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.187914 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.205574 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9jh8\" (UniqueName: \"kubernetes.io/projected/d20a172f-298d-47ac-a858-cd1600b65c4e-kube-api-access-p9jh8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-flmts\" (UID: \"d20a172f-298d-47ac-a858-cd1600b65c4e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.254739 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.358999 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-public-tls-certs\") pod \"9307c633-b44c-4f4d-8414-9f28826c30dc\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.359494 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-config-data-custom\") pod \"9307c633-b44c-4f4d-8414-9f28826c30dc\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.359538 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-config-data\") pod \"9307c633-b44c-4f4d-8414-9f28826c30dc\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.360940 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw66z\" (UniqueName: \"kubernetes.io/projected/9307c633-b44c-4f4d-8414-9f28826c30dc-kube-api-access-kw66z\") pod \"9307c633-b44c-4f4d-8414-9f28826c30dc\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.360979 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-combined-ca-bundle\") pod \"9307c633-b44c-4f4d-8414-9f28826c30dc\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.361009 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-internal-tls-certs\") pod \"9307c633-b44c-4f4d-8414-9f28826c30dc\" (UID: \"9307c633-b44c-4f4d-8414-9f28826c30dc\") " Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.375830 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96dx9" event={"ID":"528a996a-b805-4638-a027-d52d8733d109","Type":"ContainerStarted","Data":"553a264f5686d1e8fadffb83844ea2b91955b19fb158568138c80a7813c60197"} Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.393873 4740 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9307c633-b44c-4f4d-8414-9f28826c30dc-kube-api-access-kw66z" (OuterVolumeSpecName: "kube-api-access-kw66z") pod "9307c633-b44c-4f4d-8414-9f28826c30dc" (UID: "9307c633-b44c-4f4d-8414-9f28826c30dc"). InnerVolumeSpecName "kube-api-access-kw66z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.393950 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9307c633-b44c-4f4d-8414-9f28826c30dc" (UID: "9307c633-b44c-4f4d-8414-9f28826c30dc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.409547 4740 generic.go:334] "Generic (PLEG): container finished" podID="9307c633-b44c-4f4d-8414-9f28826c30dc" containerID="302a005d7d2596d5fa6e342ea5a9360993a4093c179ca42a5fcdbb899ba2fee5" exitCode=0 Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.409631 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" event={"ID":"9307c633-b44c-4f4d-8414-9f28826c30dc","Type":"ContainerDied","Data":"302a005d7d2596d5fa6e342ea5a9360993a4093c179ca42a5fcdbb899ba2fee5"} Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.409664 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" event={"ID":"9307c633-b44c-4f4d-8414-9f28826c30dc","Type":"ContainerDied","Data":"74f26ad623f47a8a4ec6e4de16fe4ab04b71dfc46b57b0a134c5c29d5055c1c2"} Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.409685 4740 scope.go:117] "RemoveContainer" containerID="302a005d7d2596d5fa6e342ea5a9360993a4093c179ca42a5fcdbb899ba2fee5" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.409837 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-75bf4fb87-2g4f8" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.427442 4740 generic.go:334] "Generic (PLEG): container finished" podID="116ff769-c12d-42af-93a9-549001762acd" containerID="2fc4036255b3f20c57469e0a421b0befb18910d0ca48c27c57bdb0b94e427c6d" exitCode=0 Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.427685 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-8669649c9d-h45rr" event={"ID":"116ff769-c12d-42af-93a9-549001762acd","Type":"ContainerDied","Data":"2fc4036255b3f20c57469e0a421b0befb18910d0ca48c27c57bdb0b94e427c6d"} Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.470406 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8a70451d-dcc2-4bce-81b1-e1f6291eac3b","Type":"ContainerStarted","Data":"e5f88cb7826afba8f611fc01b67efa02b42107d9d0e10aea15abd3bd5e8cf6d3"} Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.472294 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.472831 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-96dx9" podStartSLOduration=2.792843364 podStartE2EDuration="5.472808822s" podCreationTimestamp="2026-01-05 14:14:01 +0000 UTC" firstStartedPulling="2026-01-05 14:14:03.271607285 +0000 UTC m=+1492.578515864" lastFinishedPulling="2026-01-05 14:14:05.951572743 +0000 UTC m=+1495.258481322" observedRunningTime="2026-01-05 14:14:06.419161038 +0000 UTC m=+1495.726069617" watchObservedRunningTime="2026-01-05 14:14:06.472808822 +0000 UTC m=+1495.779717401" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.523962 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.523942067 podStartE2EDuration="37.523942067s" podCreationTimestamp="2026-01-05 14:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:14:06.509463627 +0000 UTC m=+1495.816372226" watchObservedRunningTime="2026-01-05 14:14:06.523942067 +0000 UTC m=+1495.830850646" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.530328 4740 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.530744 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw66z\" (UniqueName: \"kubernetes.io/projected/9307c633-b44c-4f4d-8414-9f28826c30dc-kube-api-access-kw66z\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.538393 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9307c633-b44c-4f4d-8414-9f28826c30dc" (UID: "9307c633-b44c-4f4d-8414-9f28826c30dc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.583436 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-config-data" (OuterVolumeSpecName: "config-data") pod "9307c633-b44c-4f4d-8414-9f28826c30dc" (UID: "9307c633-b44c-4f4d-8414-9f28826c30dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.632550 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9307c633-b44c-4f4d-8414-9f28826c30dc" (UID: "9307c633-b44c-4f4d-8414-9f28826c30dc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.642324 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.642356 4740 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.642365 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.727578 4740 scope.go:117] "RemoveContainer" containerID="302a005d7d2596d5fa6e342ea5a9360993a4093c179ca42a5fcdbb899ba2fee5" Jan 05 14:14:06 crc kubenswrapper[4740]: E0105 14:14:06.728955 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"302a005d7d2596d5fa6e342ea5a9360993a4093c179ca42a5fcdbb899ba2fee5\": container with ID starting with 302a005d7d2596d5fa6e342ea5a9360993a4093c179ca42a5fcdbb899ba2fee5 not found: ID does not exist" containerID="302a005d7d2596d5fa6e342ea5a9360993a4093c179ca42a5fcdbb899ba2fee5" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.728994 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"302a005d7d2596d5fa6e342ea5a9360993a4093c179ca42a5fcdbb899ba2fee5"} err="failed to get container status \"302a005d7d2596d5fa6e342ea5a9360993a4093c179ca42a5fcdbb899ba2fee5\": rpc error: code = NotFound desc = could not find container \"302a005d7d2596d5fa6e342ea5a9360993a4093c179ca42a5fcdbb899ba2fee5\": container with ID starting with 302a005d7d2596d5fa6e342ea5a9360993a4093c179ca42a5fcdbb899ba2fee5 not found: ID does not exist" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.757253 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9307c633-b44c-4f4d-8414-9f28826c30dc" (UID: "9307c633-b44c-4f4d-8414-9f28826c30dc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.769312 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.775443 4740 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9307c633-b44c-4f4d-8414-9f28826c30dc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.876833 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-config-data-custom\") pod \"116ff769-c12d-42af-93a9-549001762acd\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.876982 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-internal-tls-certs\") pod \"116ff769-c12d-42af-93a9-549001762acd\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.877034 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-public-tls-certs\") pod \"116ff769-c12d-42af-93a9-549001762acd\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.877100 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-combined-ca-bundle\") pod \"116ff769-c12d-42af-93a9-549001762acd\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.877168 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-config-data\") pod \"116ff769-c12d-42af-93a9-549001762acd\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.877251 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlfzb\" (UniqueName: \"kubernetes.io/projected/116ff769-c12d-42af-93a9-549001762acd-kube-api-access-jlfzb\") pod \"116ff769-c12d-42af-93a9-549001762acd\" (UID: \"116ff769-c12d-42af-93a9-549001762acd\") " Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.885271 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/116ff769-c12d-42af-93a9-549001762acd-kube-api-access-jlfzb" (OuterVolumeSpecName: "kube-api-access-jlfzb") pod "116ff769-c12d-42af-93a9-549001762acd" (UID: "116ff769-c12d-42af-93a9-549001762acd"). InnerVolumeSpecName "kube-api-access-jlfzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.895925 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "116ff769-c12d-42af-93a9-549001762acd" (UID: "116ff769-c12d-42af-93a9-549001762acd"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.990271 4740 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:06 crc kubenswrapper[4740]: I0105 14:14:06.990291 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlfzb\" (UniqueName: \"kubernetes.io/projected/116ff769-c12d-42af-93a9-549001762acd-kube-api-access-jlfzb\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:07 crc kubenswrapper[4740]: I0105 14:14:07.045987 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "116ff769-c12d-42af-93a9-549001762acd" (UID: "116ff769-c12d-42af-93a9-549001762acd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:07 crc kubenswrapper[4740]: I0105 14:14:07.055127 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "116ff769-c12d-42af-93a9-549001762acd" (UID: "116ff769-c12d-42af-93a9-549001762acd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:07 crc kubenswrapper[4740]: I0105 14:14:07.097586 4740 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:07 crc kubenswrapper[4740]: I0105 14:14:07.097620 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:07 crc kubenswrapper[4740]: I0105 14:14:07.097644 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-75bf4fb87-2g4f8"] Jan 05 14:14:07 crc kubenswrapper[4740]: I0105 14:14:07.114879 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-75bf4fb87-2g4f8"] Jan 05 14:14:07 crc kubenswrapper[4740]: I0105 14:14:07.135319 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "116ff769-c12d-42af-93a9-549001762acd" (UID: "116ff769-c12d-42af-93a9-549001762acd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:07 crc kubenswrapper[4740]: I0105 14:14:07.144814 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-config-data" (OuterVolumeSpecName: "config-data") pod "116ff769-c12d-42af-93a9-549001762acd" (UID: "116ff769-c12d-42af-93a9-549001762acd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:07 crc kubenswrapper[4740]: I0105 14:14:07.200328 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:07 crc kubenswrapper[4740]: I0105 14:14:07.200571 4740 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/116ff769-c12d-42af-93a9-549001762acd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:07 crc kubenswrapper[4740]: I0105 14:14:07.332230 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts"] Jan 05 14:14:07 crc kubenswrapper[4740]: I0105 14:14:07.493576 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" event={"ID":"d20a172f-298d-47ac-a858-cd1600b65c4e","Type":"ContainerStarted","Data":"085747bf90c9c3e22c69b0c7fdc79552bdf9ce7b643e02c373b6cb11747a393a"} Jan 05 14:14:07 crc kubenswrapper[4740]: I0105 14:14:07.499831 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-8669649c9d-h45rr" Jan 05 14:14:07 crc kubenswrapper[4740]: I0105 14:14:07.501175 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-8669649c9d-h45rr" event={"ID":"116ff769-c12d-42af-93a9-549001762acd","Type":"ContainerDied","Data":"947af3699a9aaa62c31e1417fbd983e9c7c29c1da4aa8e8da89101bd7d06e3ec"} Jan 05 14:14:07 crc kubenswrapper[4740]: I0105 14:14:07.501214 4740 scope.go:117] "RemoveContainer" containerID="2fc4036255b3f20c57469e0a421b0befb18910d0ca48c27c57bdb0b94e427c6d" Jan 05 14:14:07 crc kubenswrapper[4740]: I0105 14:14:07.557494 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-8669649c9d-h45rr"] Jan 05 14:14:07 crc kubenswrapper[4740]: I0105 14:14:07.573519 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-8669649c9d-h45rr"] Jan 05 14:14:08 crc kubenswrapper[4740]: I0105 14:14:08.995347 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="116ff769-c12d-42af-93a9-549001762acd" path="/var/lib/kubelet/pods/116ff769-c12d-42af-93a9-549001762acd/volumes" Jan 05 14:14:08 crc kubenswrapper[4740]: I0105 14:14:08.998716 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9307c633-b44c-4f4d-8414-9f28826c30dc" path="/var/lib/kubelet/pods/9307c633-b44c-4f4d-8414-9f28826c30dc/volumes" Jan 05 14:14:10 crc kubenswrapper[4740]: I0105 14:14:10.232648 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6ff48d7c64-5zmkg" Jan 05 14:14:10 crc kubenswrapper[4740]: I0105 14:14:10.292782 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-d7697fc9-j6p2f"] Jan 05 14:14:10 crc kubenswrapper[4740]: I0105 14:14:10.292992 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-d7697fc9-j6p2f" podUID="42341a11-18a3-4ef1-8e12-a9450f68370e" containerName="heat-engine" containerID="cri-o://0e9314facab50354dadf8e2c084c481422c6ac5e4073d436b095728d6d4b9374" gracePeriod=60 Jan 05 14:14:11 crc kubenswrapper[4740]: I0105 14:14:11.908635 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mh68f" Jan 05 14:14:11 crc kubenswrapper[4740]: I0105 
14:14:11.976627 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mh68f" Jan 05 14:14:12 crc kubenswrapper[4740]: I0105 14:14:12.120136 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-96dx9" Jan 05 14:14:12 crc kubenswrapper[4740]: I0105 14:14:12.120453 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-96dx9" Jan 05 14:14:12 crc kubenswrapper[4740]: I0105 14:14:12.167464 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mh68f"] Jan 05 14:14:12 crc kubenswrapper[4740]: I0105 14:14:12.186683 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-96dx9" Jan 05 14:14:12 crc kubenswrapper[4740]: I0105 14:14:12.642253 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-96dx9" Jan 05 14:14:13 crc kubenswrapper[4740]: I0105 14:14:13.597364 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mh68f" podUID="7c0e6442-ff3d-4d6d-b54a-69ea6da1e731" containerName="registry-server" containerID="cri-o://a88fe9e49b6ac1ab3cc8f1ee049e536ff3daaa24490fdde9c767c7ac6fd56bd5" gracePeriod=2 Jan 05 14:14:14 crc kubenswrapper[4740]: I0105 14:14:14.556651 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-96dx9"] Jan 05 14:14:14 crc kubenswrapper[4740]: I0105 14:14:14.612038 4740 generic.go:334] "Generic (PLEG): container finished" podID="7c0e6442-ff3d-4d6d-b54a-69ea6da1e731" containerID="a88fe9e49b6ac1ab3cc8f1ee049e536ff3daaa24490fdde9c767c7ac6fd56bd5" exitCode=0 Jan 05 14:14:14 crc kubenswrapper[4740]: I0105 14:14:14.612098 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mh68f" event={"ID":"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731","Type":"ContainerDied","Data":"a88fe9e49b6ac1ab3cc8f1ee049e536ff3daaa24490fdde9c767c7ac6fd56bd5"} Jan 05 14:14:14 crc kubenswrapper[4740]: I0105 14:14:14.612324 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-96dx9" podUID="528a996a-b805-4638-a027-d52d8733d109" containerName="registry-server" containerID="cri-o://553a264f5686d1e8fadffb83844ea2b91955b19fb158568138c80a7813c60197" gracePeriod=2 Jan 05 14:14:15 crc kubenswrapper[4740]: I0105 14:14:15.627458 4740 generic.go:334] "Generic (PLEG): container finished" podID="528a996a-b805-4638-a027-d52d8733d109" containerID="553a264f5686d1e8fadffb83844ea2b91955b19fb158568138c80a7813c60197" exitCode=0 Jan 05 14:14:15 crc kubenswrapper[4740]: I0105 14:14:15.627544 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96dx9" event={"ID":"528a996a-b805-4638-a027-d52d8733d109","Type":"ContainerDied","Data":"553a264f5686d1e8fadffb83844ea2b91955b19fb158568138c80a7813c60197"} Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.160033 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-lwqt9"] Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.172454 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-lwqt9"] Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.440597 4740 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/aodh-db-sync-whbn9"] Jan 05 14:14:18 crc kubenswrapper[4740]: E0105 14:14:18.441504 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9307c633-b44c-4f4d-8414-9f28826c30dc" containerName="heat-cfnapi" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.441531 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="9307c633-b44c-4f4d-8414-9f28826c30dc" containerName="heat-cfnapi" Jan 05 14:14:18 crc kubenswrapper[4740]: E0105 14:14:18.441620 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="116ff769-c12d-42af-93a9-549001762acd" containerName="heat-api" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.441636 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="116ff769-c12d-42af-93a9-549001762acd" containerName="heat-api" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.441909 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="116ff769-c12d-42af-93a9-549001762acd" containerName="heat-api" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.441944 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="9307c633-b44c-4f4d-8414-9f28826c30dc" containerName="heat-cfnapi" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.443230 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-whbn9" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.445970 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.457723 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-whbn9"] Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.539800 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-config-data\") pod \"aodh-db-sync-whbn9\" (UID: \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\") " pod="openstack/aodh-db-sync-whbn9" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.540472 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-combined-ca-bundle\") pod \"aodh-db-sync-whbn9\" (UID: \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\") " pod="openstack/aodh-db-sync-whbn9" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.540671 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4vjz\" (UniqueName: \"kubernetes.io/projected/728565b1-c651-4b9e-a279-75ecf0e4eeb2-kube-api-access-p4vjz\") pod \"aodh-db-sync-whbn9\" (UID: \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\") " pod="openstack/aodh-db-sync-whbn9" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.540938 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-scripts\") pod \"aodh-db-sync-whbn9\" (UID: \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\") " pod="openstack/aodh-db-sync-whbn9" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.643233 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4vjz\" (UniqueName: \"kubernetes.io/projected/728565b1-c651-4b9e-a279-75ecf0e4eeb2-kube-api-access-p4vjz\") pod \"aodh-db-sync-whbn9\" (UID: 
\"728565b1-c651-4b9e-a279-75ecf0e4eeb2\") " pod="openstack/aodh-db-sync-whbn9" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.643752 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-scripts\") pod \"aodh-db-sync-whbn9\" (UID: \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\") " pod="openstack/aodh-db-sync-whbn9" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.644921 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-config-data\") pod \"aodh-db-sync-whbn9\" (UID: \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\") " pod="openstack/aodh-db-sync-whbn9" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.645550 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-combined-ca-bundle\") pod \"aodh-db-sync-whbn9\" (UID: \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\") " pod="openstack/aodh-db-sync-whbn9" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.651478 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-config-data\") pod \"aodh-db-sync-whbn9\" (UID: \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\") " pod="openstack/aodh-db-sync-whbn9" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.668320 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-scripts\") pod \"aodh-db-sync-whbn9\" (UID: \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\") " pod="openstack/aodh-db-sync-whbn9" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.668804 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-combined-ca-bundle\") pod \"aodh-db-sync-whbn9\" (UID: \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\") " pod="openstack/aodh-db-sync-whbn9" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.669629 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4vjz\" (UniqueName: \"kubernetes.io/projected/728565b1-c651-4b9e-a279-75ecf0e4eeb2-kube-api-access-p4vjz\") pod \"aodh-db-sync-whbn9\" (UID: \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\") " pod="openstack/aodh-db-sync-whbn9" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.773321 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-whbn9" Jan 05 14:14:18 crc kubenswrapper[4740]: I0105 14:14:18.983793 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9724d8fc-2e73-46f4-b3ae-357a4c3e8313" path="/var/lib/kubelet/pods/9724d8fc-2e73-46f4-b3ae-357a4c3e8313/volumes" Jan 05 14:14:19 crc kubenswrapper[4740]: I0105 14:14:19.227923 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="8150ac8e-6303-4af1-8a21-6fb434df508b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.17:5671: connect: connection refused" Jan 05 14:14:19 crc kubenswrapper[4740]: I0105 14:14:19.635091 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:14:19 crc kubenswrapper[4740]: I0105 14:14:19.706563 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 05 14:14:19 crc kubenswrapper[4740]: E0105 14:14:19.983352 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0e9314facab50354dadf8e2c084c481422c6ac5e4073d436b095728d6d4b9374" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 05 14:14:19 crc kubenswrapper[4740]: E0105 14:14:19.986033 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0e9314facab50354dadf8e2c084c481422c6ac5e4073d436b095728d6d4b9374" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 05 14:14:19 crc kubenswrapper[4740]: E0105 14:14:19.987225 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0e9314facab50354dadf8e2c084c481422c6ac5e4073d436b095728d6d4b9374" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Jan 05 14:14:19 crc kubenswrapper[4740]: E0105 14:14:19.987273 4740 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-d7697fc9-j6p2f" podUID="42341a11-18a3-4ef1-8e12-a9450f68370e" containerName="heat-engine" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.107412 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mh68f" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.156427 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-96dx9" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.178127 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4czdh\" (UniqueName: \"kubernetes.io/projected/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-kube-api-access-4czdh\") pod \"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731\" (UID: \"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731\") " Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.178304 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-catalog-content\") pod \"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731\" (UID: \"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731\") " Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.178487 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-utilities\") pod \"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731\" (UID: \"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731\") " Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.179735 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-utilities" (OuterVolumeSpecName: "utilities") pod "7c0e6442-ff3d-4d6d-b54a-69ea6da1e731" (UID: "7c0e6442-ff3d-4d6d-b54a-69ea6da1e731"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.186116 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-kube-api-access-4czdh" (OuterVolumeSpecName: "kube-api-access-4czdh") pod "7c0e6442-ff3d-4d6d-b54a-69ea6da1e731" (UID: "7c0e6442-ff3d-4d6d-b54a-69ea6da1e731"). InnerVolumeSpecName "kube-api-access-4czdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.216840 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-whbn9"] Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.272371 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c0e6442-ff3d-4d6d-b54a-69ea6da1e731" (UID: "7c0e6442-ff3d-4d6d-b54a-69ea6da1e731"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.280977 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28jls\" (UniqueName: \"kubernetes.io/projected/528a996a-b805-4638-a027-d52d8733d109-kube-api-access-28jls\") pod \"528a996a-b805-4638-a027-d52d8733d109\" (UID: \"528a996a-b805-4638-a027-d52d8733d109\") " Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.281409 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/528a996a-b805-4638-a027-d52d8733d109-catalog-content\") pod \"528a996a-b805-4638-a027-d52d8733d109\" (UID: \"528a996a-b805-4638-a027-d52d8733d109\") " Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.281607 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/528a996a-b805-4638-a027-d52d8733d109-utilities\") pod \"528a996a-b805-4638-a027-d52d8733d109\" (UID: \"528a996a-b805-4638-a027-d52d8733d109\") " Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.282460 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.282558 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.282621 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4czdh\" (UniqueName: \"kubernetes.io/projected/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731-kube-api-access-4czdh\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.283251 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/528a996a-b805-4638-a027-d52d8733d109-utilities" (OuterVolumeSpecName: "utilities") pod "528a996a-b805-4638-a027-d52d8733d109" (UID: "528a996a-b805-4638-a027-d52d8733d109"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.285537 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/528a996a-b805-4638-a027-d52d8733d109-kube-api-access-28jls" (OuterVolumeSpecName: "kube-api-access-28jls") pod "528a996a-b805-4638-a027-d52d8733d109" (UID: "528a996a-b805-4638-a027-d52d8733d109"). InnerVolumeSpecName "kube-api-access-28jls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.299643 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/528a996a-b805-4638-a027-d52d8733d109-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "528a996a-b805-4638-a027-d52d8733d109" (UID: "528a996a-b805-4638-a027-d52d8733d109"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.412742 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28jls\" (UniqueName: \"kubernetes.io/projected/528a996a-b805-4638-a027-d52d8733d109-kube-api-access-28jls\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.412771 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/528a996a-b805-4638-a027-d52d8733d109-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.412780 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/528a996a-b805-4638-a027-d52d8733d109-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.731466 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-96dx9" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.731472 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96dx9" event={"ID":"528a996a-b805-4638-a027-d52d8733d109","Type":"ContainerDied","Data":"43fb26800aed6b188f6ec937b20150373d5bb55b6ec0040527fce3ff30417561"} Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.732383 4740 scope.go:117] "RemoveContainer" containerID="553a264f5686d1e8fadffb83844ea2b91955b19fb158568138c80a7813c60197" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.733712 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" event={"ID":"d20a172f-298d-47ac-a858-cd1600b65c4e","Type":"ContainerStarted","Data":"dc788732407f2bc98a0e46f77b07e57c4fe2f0f3bdb59a9bea790a30d30ba4b2"} Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.742288 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-whbn9" event={"ID":"728565b1-c651-4b9e-a279-75ecf0e4eeb2","Type":"ContainerStarted","Data":"a210d1c84aeea35f981ac79229597b02157dd1f830949ec84b1c1b524e187feb"} Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.755231 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mh68f" event={"ID":"7c0e6442-ff3d-4d6d-b54a-69ea6da1e731","Type":"ContainerDied","Data":"c28bf85ebf634f07efa58a6bf38e5f0000c860831927adebc1e09326545a69ef"} Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.755318 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mh68f" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.769565 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" podStartSLOduration=3.5182768639999997 podStartE2EDuration="15.769525401s" podCreationTimestamp="2026-01-05 14:14:05 +0000 UTC" firstStartedPulling="2026-01-05 14:14:07.381022568 +0000 UTC m=+1496.687931147" lastFinishedPulling="2026-01-05 14:14:19.632271105 +0000 UTC m=+1508.939179684" observedRunningTime="2026-01-05 14:14:20.753635803 +0000 UTC m=+1510.060544382" watchObservedRunningTime="2026-01-05 14:14:20.769525401 +0000 UTC m=+1510.076433970" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.785593 4740 scope.go:117] "RemoveContainer" containerID="a83f6672ec1df258da11a1e585d7447a75a7b5d235fb12bb2aa437b896b67da8" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.835227 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-96dx9"] Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.855380 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-96dx9"] Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.872484 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mh68f"] Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.884030 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mh68f"] Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.898961 4740 scope.go:117] "RemoveContainer" containerID="cfd9d689a483a8848647ca99eb34c99f4682ad2f5efd05f6a59f94c08c64951b" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.931663 4740 scope.go:117] "RemoveContainer" containerID="a88fe9e49b6ac1ab3cc8f1ee049e536ff3daaa24490fdde9c767c7ac6fd56bd5" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.979751 4740 scope.go:117] "RemoveContainer" containerID="1abf241e5ab43fb35d5992dfaf9d5a59f2583f79c13154922cb87e9d5753eb60" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.986512 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="528a996a-b805-4638-a027-d52d8733d109" path="/var/lib/kubelet/pods/528a996a-b805-4638-a027-d52d8733d109/volumes" Jan 05 14:14:20 crc kubenswrapper[4740]: I0105 14:14:20.987728 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c0e6442-ff3d-4d6d-b54a-69ea6da1e731" path="/var/lib/kubelet/pods/7c0e6442-ff3d-4d6d-b54a-69ea6da1e731/volumes" Jan 05 14:14:21 crc kubenswrapper[4740]: I0105 14:14:21.087116 4740 scope.go:117] "RemoveContainer" containerID="5157bca55b6171d4c2d69378642d9f0bda9897da8a571730702c86d93823b1e3" Jan 05 14:14:23 crc kubenswrapper[4740]: I0105 14:14:23.799857 4740 generic.go:334] "Generic (PLEG): container finished" podID="42341a11-18a3-4ef1-8e12-a9450f68370e" containerID="0e9314facab50354dadf8e2c084c481422c6ac5e4073d436b095728d6d4b9374" exitCode=0 Jan 05 14:14:23 crc kubenswrapper[4740]: I0105 14:14:23.800090 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-d7697fc9-j6p2f" event={"ID":"42341a11-18a3-4ef1-8e12-a9450f68370e","Type":"ContainerDied","Data":"0e9314facab50354dadf8e2c084c481422c6ac5e4073d436b095728d6d4b9374"} Jan 05 14:14:26 crc kubenswrapper[4740]: I0105 14:14:26.768838 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:14:26 crc kubenswrapper[4740]: I0105 14:14:26.835263 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-d7697fc9-j6p2f" event={"ID":"42341a11-18a3-4ef1-8e12-a9450f68370e","Type":"ContainerDied","Data":"e53b5163e6783a5ee921103e1e5209438505b378f377f822cbd66c917f18352c"} Jan 05 14:14:26 crc kubenswrapper[4740]: I0105 14:14:26.835314 4740 scope.go:117] "RemoveContainer" containerID="0e9314facab50354dadf8e2c084c481422c6ac5e4073d436b095728d6d4b9374" Jan 05 14:14:26 crc kubenswrapper[4740]: I0105 14:14:26.835443 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-d7697fc9-j6p2f" Jan 05 14:14:26 crc kubenswrapper[4740]: I0105 14:14:26.951709 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krvt6\" (UniqueName: \"kubernetes.io/projected/42341a11-18a3-4ef1-8e12-a9450f68370e-kube-api-access-krvt6\") pod \"42341a11-18a3-4ef1-8e12-a9450f68370e\" (UID: \"42341a11-18a3-4ef1-8e12-a9450f68370e\") " Jan 05 14:14:26 crc kubenswrapper[4740]: I0105 14:14:26.952434 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-config-data\") pod \"42341a11-18a3-4ef1-8e12-a9450f68370e\" (UID: \"42341a11-18a3-4ef1-8e12-a9450f68370e\") " Jan 05 14:14:26 crc kubenswrapper[4740]: I0105 14:14:26.952594 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-combined-ca-bundle\") pod \"42341a11-18a3-4ef1-8e12-a9450f68370e\" (UID: \"42341a11-18a3-4ef1-8e12-a9450f68370e\") " Jan 05 14:14:26 crc kubenswrapper[4740]: I0105 14:14:26.952879 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-config-data-custom\") pod \"42341a11-18a3-4ef1-8e12-a9450f68370e\" (UID: \"42341a11-18a3-4ef1-8e12-a9450f68370e\") " Jan 05 14:14:26 crc kubenswrapper[4740]: I0105 14:14:26.980593 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "42341a11-18a3-4ef1-8e12-a9450f68370e" (UID: "42341a11-18a3-4ef1-8e12-a9450f68370e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:26 crc kubenswrapper[4740]: I0105 14:14:26.985507 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42341a11-18a3-4ef1-8e12-a9450f68370e-kube-api-access-krvt6" (OuterVolumeSpecName: "kube-api-access-krvt6") pod "42341a11-18a3-4ef1-8e12-a9450f68370e" (UID: "42341a11-18a3-4ef1-8e12-a9450f68370e"). InnerVolumeSpecName "kube-api-access-krvt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:14:26 crc kubenswrapper[4740]: I0105 14:14:26.999840 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42341a11-18a3-4ef1-8e12-a9450f68370e" (UID: "42341a11-18a3-4ef1-8e12-a9450f68370e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:27 crc kubenswrapper[4740]: I0105 14:14:27.035654 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-config-data" (OuterVolumeSpecName: "config-data") pod "42341a11-18a3-4ef1-8e12-a9450f68370e" (UID: "42341a11-18a3-4ef1-8e12-a9450f68370e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:27 crc kubenswrapper[4740]: I0105 14:14:27.060499 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krvt6\" (UniqueName: \"kubernetes.io/projected/42341a11-18a3-4ef1-8e12-a9450f68370e-kube-api-access-krvt6\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:27 crc kubenswrapper[4740]: I0105 14:14:27.060570 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:27 crc kubenswrapper[4740]: I0105 14:14:27.060601 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:27 crc kubenswrapper[4740]: I0105 14:14:27.060617 4740 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42341a11-18a3-4ef1-8e12-a9450f68370e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:27 crc kubenswrapper[4740]: I0105 14:14:27.190524 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-d7697fc9-j6p2f"] Jan 05 14:14:27 crc kubenswrapper[4740]: I0105 14:14:27.206806 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-d7697fc9-j6p2f"] Jan 05 14:14:27 crc kubenswrapper[4740]: I0105 14:14:27.855220 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-whbn9" event={"ID":"728565b1-c651-4b9e-a279-75ecf0e4eeb2","Type":"ContainerStarted","Data":"6a50e3ee2c22fcb701f57ab938316fb8e8505f8933589d28b8f45b1c088d327e"} Jan 05 14:14:27 crc kubenswrapper[4740]: I0105 14:14:27.877110 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-whbn9" podStartSLOduration=2.870749654 podStartE2EDuration="9.877091349s" podCreationTimestamp="2026-01-05 14:14:18 +0000 UTC" firstStartedPulling="2026-01-05 14:14:20.230395372 +0000 UTC m=+1509.537303951" lastFinishedPulling="2026-01-05 14:14:27.236737067 +0000 UTC m=+1516.543645646" observedRunningTime="2026-01-05 14:14:27.875587059 +0000 UTC m=+1517.182495648" watchObservedRunningTime="2026-01-05 14:14:27.877091349 +0000 UTC m=+1517.183999928" Jan 05 14:14:27 crc kubenswrapper[4740]: I0105 14:14:27.929660 4740 scope.go:117] "RemoveContainer" containerID="94c7df84480fd76a5119a275577c87912d242bc8d1522e50a9dbfcc6aa200f59" Jan 05 14:14:27 crc kubenswrapper[4740]: I0105 14:14:27.970819 4740 scope.go:117] "RemoveContainer" containerID="edf6bcae04123e30ad5daccfc24ec9e527589649433100b25317cc7e703fe4ea" Jan 05 14:14:28 crc kubenswrapper[4740]: I0105 14:14:28.027635 4740 scope.go:117] "RemoveContainer" containerID="7045ef52d865795eb8d0b571baed744f205f5772479fa437d50c95fd3e4e3e89" Jan 05 14:14:28 crc kubenswrapper[4740]: I0105 14:14:28.052961 4740 scope.go:117] "RemoveContainer" containerID="761f5acf52bce2076b89940bc686a118270fc0ca2adeaed1f8c5697af7c01aef" Jan 05 14:14:28 crc 
kubenswrapper[4740]: I0105 14:14:28.129399 4740 scope.go:117] "RemoveContainer" containerID="1f37daa9eb9780e6ce82b90be0b0b660cb628f160ae46c97b52ca28948c80008" Jan 05 14:14:28 crc kubenswrapper[4740]: I0105 14:14:28.985631 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42341a11-18a3-4ef1-8e12-a9450f68370e" path="/var/lib/kubelet/pods/42341a11-18a3-4ef1-8e12-a9450f68370e/volumes" Jan 05 14:14:29 crc kubenswrapper[4740]: I0105 14:14:29.230317 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Jan 05 14:14:29 crc kubenswrapper[4740]: I0105 14:14:29.298638 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Jan 05 14:14:30 crc kubenswrapper[4740]: I0105 14:14:30.898883 4740 generic.go:334] "Generic (PLEG): container finished" podID="728565b1-c651-4b9e-a279-75ecf0e4eeb2" containerID="6a50e3ee2c22fcb701f57ab938316fb8e8505f8933589d28b8f45b1c088d327e" exitCode=0 Jan 05 14:14:30 crc kubenswrapper[4740]: I0105 14:14:30.898988 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-whbn9" event={"ID":"728565b1-c651-4b9e-a279-75ecf0e4eeb2","Type":"ContainerDied","Data":"6a50e3ee2c22fcb701f57ab938316fb8e8505f8933589d28b8f45b1c088d327e"} Jan 05 14:14:31 crc kubenswrapper[4740]: I0105 14:14:31.917159 4740 generic.go:334] "Generic (PLEG): container finished" podID="d20a172f-298d-47ac-a858-cd1600b65c4e" containerID="dc788732407f2bc98a0e46f77b07e57c4fe2f0f3bdb59a9bea790a30d30ba4b2" exitCode=0 Jan 05 14:14:31 crc kubenswrapper[4740]: I0105 14:14:31.917203 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" event={"ID":"d20a172f-298d-47ac-a858-cd1600b65c4e","Type":"ContainerDied","Data":"dc788732407f2bc98a0e46f77b07e57c4fe2f0f3bdb59a9bea790a30d30ba4b2"} Jan 05 14:14:32 crc kubenswrapper[4740]: I0105 14:14:32.363247 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-whbn9" Jan 05 14:14:32 crc kubenswrapper[4740]: I0105 14:14:32.456913 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4vjz\" (UniqueName: \"kubernetes.io/projected/728565b1-c651-4b9e-a279-75ecf0e4eeb2-kube-api-access-p4vjz\") pod \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\" (UID: \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\") " Jan 05 14:14:32 crc kubenswrapper[4740]: I0105 14:14:32.457204 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-scripts\") pod \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\" (UID: \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\") " Jan 05 14:14:32 crc kubenswrapper[4740]: I0105 14:14:32.457326 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-config-data\") pod \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\" (UID: \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\") " Jan 05 14:14:32 crc kubenswrapper[4740]: I0105 14:14:32.457403 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-combined-ca-bundle\") pod \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\" (UID: \"728565b1-c651-4b9e-a279-75ecf0e4eeb2\") " Jan 05 14:14:32 crc kubenswrapper[4740]: I0105 14:14:32.462706 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/728565b1-c651-4b9e-a279-75ecf0e4eeb2-kube-api-access-p4vjz" (OuterVolumeSpecName: "kube-api-access-p4vjz") pod "728565b1-c651-4b9e-a279-75ecf0e4eeb2" (UID: "728565b1-c651-4b9e-a279-75ecf0e4eeb2"). InnerVolumeSpecName "kube-api-access-p4vjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:14:32 crc kubenswrapper[4740]: I0105 14:14:32.462891 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-scripts" (OuterVolumeSpecName: "scripts") pod "728565b1-c651-4b9e-a279-75ecf0e4eeb2" (UID: "728565b1-c651-4b9e-a279-75ecf0e4eeb2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:32 crc kubenswrapper[4740]: I0105 14:14:32.504523 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-config-data" (OuterVolumeSpecName: "config-data") pod "728565b1-c651-4b9e-a279-75ecf0e4eeb2" (UID: "728565b1-c651-4b9e-a279-75ecf0e4eeb2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:32 crc kubenswrapper[4740]: I0105 14:14:32.518586 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "728565b1-c651-4b9e-a279-75ecf0e4eeb2" (UID: "728565b1-c651-4b9e-a279-75ecf0e4eeb2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:32 crc kubenswrapper[4740]: I0105 14:14:32.559861 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4vjz\" (UniqueName: \"kubernetes.io/projected/728565b1-c651-4b9e-a279-75ecf0e4eeb2-kube-api-access-p4vjz\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:32 crc kubenswrapper[4740]: I0105 14:14:32.559902 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:32 crc kubenswrapper[4740]: I0105 14:14:32.559914 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:32 crc kubenswrapper[4740]: I0105 14:14:32.559929 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728565b1-c651-4b9e-a279-75ecf0e4eeb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:32 crc kubenswrapper[4740]: I0105 14:14:32.932787 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-whbn9" event={"ID":"728565b1-c651-4b9e-a279-75ecf0e4eeb2","Type":"ContainerDied","Data":"a210d1c84aeea35f981ac79229597b02157dd1f830949ec84b1c1b524e187feb"} Jan 05 14:14:32 crc kubenswrapper[4740]: I0105 14:14:32.933148 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a210d1c84aeea35f981ac79229597b02157dd1f830949ec84b1c1b524e187feb" Jan 05 14:14:32 crc kubenswrapper[4740]: I0105 14:14:32.932914 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-whbn9" Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.451814 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.482198 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-ssh-key\") pod \"d20a172f-298d-47ac-a858-cd1600b65c4e\" (UID: \"d20a172f-298d-47ac-a858-cd1600b65c4e\") " Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.482341 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-inventory\") pod \"d20a172f-298d-47ac-a858-cd1600b65c4e\" (UID: \"d20a172f-298d-47ac-a858-cd1600b65c4e\") " Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.482373 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9jh8\" (UniqueName: \"kubernetes.io/projected/d20a172f-298d-47ac-a858-cd1600b65c4e-kube-api-access-p9jh8\") pod \"d20a172f-298d-47ac-a858-cd1600b65c4e\" (UID: \"d20a172f-298d-47ac-a858-cd1600b65c4e\") " Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.482623 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-repo-setup-combined-ca-bundle\") pod \"d20a172f-298d-47ac-a858-cd1600b65c4e\" (UID: \"d20a172f-298d-47ac-a858-cd1600b65c4e\") " Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.531249 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d20a172f-298d-47ac-a858-cd1600b65c4e" (UID: "d20a172f-298d-47ac-a858-cd1600b65c4e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.531306 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d20a172f-298d-47ac-a858-cd1600b65c4e-kube-api-access-p9jh8" (OuterVolumeSpecName: "kube-api-access-p9jh8") pod "d20a172f-298d-47ac-a858-cd1600b65c4e" (UID: "d20a172f-298d-47ac-a858-cd1600b65c4e"). InnerVolumeSpecName "kube-api-access-p9jh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.567217 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d20a172f-298d-47ac-a858-cd1600b65c4e" (UID: "d20a172f-298d-47ac-a858-cd1600b65c4e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.589565 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9jh8\" (UniqueName: \"kubernetes.io/projected/d20a172f-298d-47ac-a858-cd1600b65c4e-kube-api-access-p9jh8\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.589605 4740 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.589621 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.596422 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-inventory" (OuterVolumeSpecName: "inventory") pod "d20a172f-298d-47ac-a858-cd1600b65c4e" (UID: "d20a172f-298d-47ac-a858-cd1600b65c4e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.632009 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.632274 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerName="aodh-api" containerID="cri-o://cd8c32c9e8e2b0ea3fd487bd88c97bad3c4291d7986b61f79148c332b9ed1663" gracePeriod=30 Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.632746 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerName="aodh-notifier" containerID="cri-o://87dda8af178e5c6e53129ed4da8013810d0bbed083a8d069aaca81ddd115d5b6" gracePeriod=30 Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.632850 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerName="aodh-listener" containerID="cri-o://236541ca6123d7d5583d91aeb48255b7f86cc79c0fd2ca95738893b0d0a7d14f" gracePeriod=30 Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.633887 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerName="aodh-evaluator" containerID="cri-o://278c2034c407eb00ceb629ad204f5ba2557b2d84ec7b3deca21971b273afa162" gracePeriod=30 Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.694870 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d20a172f-298d-47ac-a858-cd1600b65c4e-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.946250 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" event={"ID":"d20a172f-298d-47ac-a858-cd1600b65c4e","Type":"ContainerDied","Data":"085747bf90c9c3e22c69b0c7fdc79552bdf9ce7b643e02c373b6cb11747a393a"} Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.947313 4740 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="085747bf90c9c3e22c69b0c7fdc79552bdf9ce7b643e02c373b6cb11747a393a" Jan 05 14:14:33 crc kubenswrapper[4740]: I0105 14:14:33.946511 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-flmts" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.418473 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j"] Jan 05 14:14:34 crc kubenswrapper[4740]: E0105 14:14:34.419166 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0e6442-ff3d-4d6d-b54a-69ea6da1e731" containerName="extract-utilities" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.419183 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0e6442-ff3d-4d6d-b54a-69ea6da1e731" containerName="extract-utilities" Jan 05 14:14:34 crc kubenswrapper[4740]: E0105 14:14:34.419192 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528a996a-b805-4638-a027-d52d8733d109" containerName="extract-content" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.419198 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="528a996a-b805-4638-a027-d52d8733d109" containerName="extract-content" Jan 05 14:14:34 crc kubenswrapper[4740]: E0105 14:14:34.419209 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0e6442-ff3d-4d6d-b54a-69ea6da1e731" containerName="extract-content" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.419215 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0e6442-ff3d-4d6d-b54a-69ea6da1e731" containerName="extract-content" Jan 05 14:14:34 crc kubenswrapper[4740]: E0105 14:14:34.419231 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0e6442-ff3d-4d6d-b54a-69ea6da1e731" containerName="registry-server" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.419237 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0e6442-ff3d-4d6d-b54a-69ea6da1e731" containerName="registry-server" Jan 05 14:14:34 crc kubenswrapper[4740]: E0105 14:14:34.419246 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528a996a-b805-4638-a027-d52d8733d109" containerName="extract-utilities" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.419254 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="528a996a-b805-4638-a027-d52d8733d109" containerName="extract-utilities" Jan 05 14:14:34 crc kubenswrapper[4740]: E0105 14:14:34.419267 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728565b1-c651-4b9e-a279-75ecf0e4eeb2" containerName="aodh-db-sync" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.419273 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="728565b1-c651-4b9e-a279-75ecf0e4eeb2" containerName="aodh-db-sync" Jan 05 14:14:34 crc kubenswrapper[4740]: E0105 14:14:34.419295 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42341a11-18a3-4ef1-8e12-a9450f68370e" containerName="heat-engine" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.419300 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="42341a11-18a3-4ef1-8e12-a9450f68370e" containerName="heat-engine" Jan 05 14:14:34 crc kubenswrapper[4740]: E0105 14:14:34.419315 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528a996a-b805-4638-a027-d52d8733d109" containerName="registry-server" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.419323 4740 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="528a996a-b805-4638-a027-d52d8733d109" containerName="registry-server" Jan 05 14:14:34 crc kubenswrapper[4740]: E0105 14:14:34.419339 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d20a172f-298d-47ac-a858-cd1600b65c4e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.419346 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d20a172f-298d-47ac-a858-cd1600b65c4e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.419562 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c0e6442-ff3d-4d6d-b54a-69ea6da1e731" containerName="registry-server" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.419584 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="42341a11-18a3-4ef1-8e12-a9450f68370e" containerName="heat-engine" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.419600 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="d20a172f-298d-47ac-a858-cd1600b65c4e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.419613 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="728565b1-c651-4b9e-a279-75ecf0e4eeb2" containerName="aodh-db-sync" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.419624 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="528a996a-b805-4638-a027-d52d8733d109" containerName="registry-server" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.420460 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.423297 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.424422 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.424523 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.424689 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.440280 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j"] Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.530975 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35e15f1f-4511-493c-8e96-8446fb0b7b14-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xkn8j\" (UID: \"35e15f1f-4511-493c-8e96-8446fb0b7b14\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.531430 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d7pm\" (UniqueName: \"kubernetes.io/projected/35e15f1f-4511-493c-8e96-8446fb0b7b14-kube-api-access-6d7pm\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xkn8j\" (UID: \"35e15f1f-4511-493c-8e96-8446fb0b7b14\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.531537 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35e15f1f-4511-493c-8e96-8446fb0b7b14-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xkn8j\" (UID: \"35e15f1f-4511-493c-8e96-8446fb0b7b14\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.634186 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35e15f1f-4511-493c-8e96-8446fb0b7b14-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xkn8j\" (UID: \"35e15f1f-4511-493c-8e96-8446fb0b7b14\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.634378 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35e15f1f-4511-493c-8e96-8446fb0b7b14-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xkn8j\" (UID: \"35e15f1f-4511-493c-8e96-8446fb0b7b14\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.634549 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d7pm\" (UniqueName: \"kubernetes.io/projected/35e15f1f-4511-493c-8e96-8446fb0b7b14-kube-api-access-6d7pm\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xkn8j\" (UID: \"35e15f1f-4511-493c-8e96-8446fb0b7b14\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.639845 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35e15f1f-4511-493c-8e96-8446fb0b7b14-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xkn8j\" (UID: \"35e15f1f-4511-493c-8e96-8446fb0b7b14\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.639856 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35e15f1f-4511-493c-8e96-8446fb0b7b14-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xkn8j\" (UID: \"35e15f1f-4511-493c-8e96-8446fb0b7b14\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.660865 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d7pm\" (UniqueName: \"kubernetes.io/projected/35e15f1f-4511-493c-8e96-8446fb0b7b14-kube-api-access-6d7pm\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xkn8j\" (UID: \"35e15f1f-4511-493c-8e96-8446fb0b7b14\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" Jan 05 14:14:34 crc kubenswrapper[4740]: I0105 14:14:34.738787 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" Jan 05 14:14:35 crc kubenswrapper[4740]: I0105 14:14:35.296540 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j"] Jan 05 14:14:35 crc kubenswrapper[4740]: I0105 14:14:35.402845 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-1" podUID="4396968c-d77b-434d-888f-3ab578514bbe" containerName="rabbitmq" containerID="cri-o://9c471ebbc6929cb56e3c7e7bca0c77bb703fd4061fb6e8de0bccfc7f59fb3bcf" gracePeriod=604794 Jan 05 14:14:35 crc kubenswrapper[4740]: I0105 14:14:35.972414 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" event={"ID":"35e15f1f-4511-493c-8e96-8446fb0b7b14","Type":"ContainerStarted","Data":"1fd6b9110e6f42035be0500b5ec65a851b92a67a9582219bf47dc42ccb7bcc42"} Jan 05 14:14:36 crc kubenswrapper[4740]: I0105 14:14:36.688023 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="4396968c-d77b-434d-888f-3ab578514bbe" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.137:5671: connect: connection refused" Jan 05 14:14:36 crc kubenswrapper[4740]: I0105 14:14:36.990382 4740 generic.go:334] "Generic (PLEG): container finished" podID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerID="278c2034c407eb00ceb629ad204f5ba2557b2d84ec7b3deca21971b273afa162" exitCode=0 Jan 05 14:14:36 crc kubenswrapper[4740]: I0105 14:14:36.990703 4740 generic.go:334] "Generic (PLEG): container finished" podID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerID="cd8c32c9e8e2b0ea3fd487bd88c97bad3c4291d7986b61f79148c332b9ed1663" exitCode=0 Jan 05 14:14:36 crc kubenswrapper[4740]: I0105 14:14:36.990729 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"96b6a192-1efc-47ce-9c5b-26539409d69c","Type":"ContainerDied","Data":"278c2034c407eb00ceb629ad204f5ba2557b2d84ec7b3deca21971b273afa162"} Jan 05 14:14:36 crc kubenswrapper[4740]: I0105 14:14:36.990756 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"96b6a192-1efc-47ce-9c5b-26539409d69c","Type":"ContainerDied","Data":"cd8c32c9e8e2b0ea3fd487bd88c97bad3c4291d7986b61f79148c332b9ed1663"} Jan 05 14:14:38 crc kubenswrapper[4740]: I0105 14:14:38.009706 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" event={"ID":"35e15f1f-4511-493c-8e96-8446fb0b7b14","Type":"ContainerStarted","Data":"d57d6d525fa9bde9abfed6ff3bb9d9a33e367bfe895c62e6971516a81872b476"} Jan 05 14:14:38 crc kubenswrapper[4740]: I0105 14:14:38.030657 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" podStartSLOduration=3.320744984 podStartE2EDuration="4.030637917s" podCreationTimestamp="2026-01-05 14:14:34 +0000 UTC" firstStartedPulling="2026-01-05 14:14:35.319512641 +0000 UTC m=+1524.626421220" lastFinishedPulling="2026-01-05 14:14:36.029405574 +0000 UTC m=+1525.336314153" observedRunningTime="2026-01-05 14:14:38.027176553 +0000 UTC m=+1527.334085182" watchObservedRunningTime="2026-01-05 14:14:38.030637917 +0000 UTC m=+1527.337546506" Jan 05 14:14:41 crc kubenswrapper[4740]: I0105 14:14:41.063767 4740 generic.go:334] "Generic (PLEG): container finished" podID="35e15f1f-4511-493c-8e96-8446fb0b7b14" 
containerID="d57d6d525fa9bde9abfed6ff3bb9d9a33e367bfe895c62e6971516a81872b476" exitCode=0 Jan 05 14:14:41 crc kubenswrapper[4740]: I0105 14:14:41.063985 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" event={"ID":"35e15f1f-4511-493c-8e96-8446fb0b7b14","Type":"ContainerDied","Data":"d57d6d525fa9bde9abfed6ff3bb9d9a33e367bfe895c62e6971516a81872b476"} Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.083201 4740 generic.go:334] "Generic (PLEG): container finished" podID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerID="87dda8af178e5c6e53129ed4da8013810d0bbed083a8d069aaca81ddd115d5b6" exitCode=0 Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.083265 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"96b6a192-1efc-47ce-9c5b-26539409d69c","Type":"ContainerDied","Data":"87dda8af178e5c6e53129ed4da8013810d0bbed083a8d069aaca81ddd115d5b6"} Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.633029 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.709411 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.757128 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35e15f1f-4511-493c-8e96-8446fb0b7b14-inventory\") pod \"35e15f1f-4511-493c-8e96-8446fb0b7b14\" (UID: \"35e15f1f-4511-493c-8e96-8446fb0b7b14\") " Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.757199 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d7pm\" (UniqueName: \"kubernetes.io/projected/35e15f1f-4511-493c-8e96-8446fb0b7b14-kube-api-access-6d7pm\") pod \"35e15f1f-4511-493c-8e96-8446fb0b7b14\" (UID: \"35e15f1f-4511-493c-8e96-8446fb0b7b14\") " Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.757255 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35e15f1f-4511-493c-8e96-8446fb0b7b14-ssh-key\") pod \"35e15f1f-4511-493c-8e96-8446fb0b7b14\" (UID: \"35e15f1f-4511-493c-8e96-8446fb0b7b14\") " Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.768661 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35e15f1f-4511-493c-8e96-8446fb0b7b14-kube-api-access-6d7pm" (OuterVolumeSpecName: "kube-api-access-6d7pm") pod "35e15f1f-4511-493c-8e96-8446fb0b7b14" (UID: "35e15f1f-4511-493c-8e96-8446fb0b7b14"). InnerVolumeSpecName "kube-api-access-6d7pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.798439 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e15f1f-4511-493c-8e96-8446fb0b7b14-inventory" (OuterVolumeSpecName: "inventory") pod "35e15f1f-4511-493c-8e96-8446fb0b7b14" (UID: "35e15f1f-4511-493c-8e96-8446fb0b7b14"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.799628 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e15f1f-4511-493c-8e96-8446fb0b7b14-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "35e15f1f-4511-493c-8e96-8446fb0b7b14" (UID: "35e15f1f-4511-493c-8e96-8446fb0b7b14"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.859684 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4396968c-d77b-434d-888f-3ab578514bbe-erlang-cookie-secret\") pod \"4396968c-d77b-434d-888f-3ab578514bbe\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.859740 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-erlang-cookie\") pod \"4396968c-d77b-434d-888f-3ab578514bbe\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.860136 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\") pod \"4396968c-d77b-434d-888f-3ab578514bbe\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.860233 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-server-conf\") pod \"4396968c-d77b-434d-888f-3ab578514bbe\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.860261 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-tls\") pod \"4396968c-d77b-434d-888f-3ab578514bbe\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.860277 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-config-data\") pod \"4396968c-d77b-434d-888f-3ab578514bbe\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.860297 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4396968c-d77b-434d-888f-3ab578514bbe" (UID: "4396968c-d77b-434d-888f-3ab578514bbe"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.860355 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-confd\") pod \"4396968c-d77b-434d-888f-3ab578514bbe\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.860410 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-plugins\") pod \"4396968c-d77b-434d-888f-3ab578514bbe\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.860447 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4396968c-d77b-434d-888f-3ab578514bbe-pod-info\") pod \"4396968c-d77b-434d-888f-3ab578514bbe\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.860530 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k26n\" (UniqueName: \"kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-kube-api-access-2k26n\") pod \"4396968c-d77b-434d-888f-3ab578514bbe\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.860650 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-plugins-conf\") pod \"4396968c-d77b-434d-888f-3ab578514bbe\" (UID: \"4396968c-d77b-434d-888f-3ab578514bbe\") " Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.861235 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35e15f1f-4511-493c-8e96-8446fb0b7b14-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.861256 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d7pm\" (UniqueName: \"kubernetes.io/projected/35e15f1f-4511-493c-8e96-8446fb0b7b14-kube-api-access-6d7pm\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.861267 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35e15f1f-4511-493c-8e96-8446fb0b7b14-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.861276 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.861575 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4396968c-d77b-434d-888f-3ab578514bbe" (UID: "4396968c-d77b-434d-888f-3ab578514bbe"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.862000 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4396968c-d77b-434d-888f-3ab578514bbe" (UID: "4396968c-d77b-434d-888f-3ab578514bbe"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.864354 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-kube-api-access-2k26n" (OuterVolumeSpecName: "kube-api-access-2k26n") pod "4396968c-d77b-434d-888f-3ab578514bbe" (UID: "4396968c-d77b-434d-888f-3ab578514bbe"). InnerVolumeSpecName "kube-api-access-2k26n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.864445 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4396968c-d77b-434d-888f-3ab578514bbe-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4396968c-d77b-434d-888f-3ab578514bbe" (UID: "4396968c-d77b-434d-888f-3ab578514bbe"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.865839 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4396968c-d77b-434d-888f-3ab578514bbe" (UID: "4396968c-d77b-434d-888f-3ab578514bbe"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.937597 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-server-conf" (OuterVolumeSpecName: "server-conf") pod "4396968c-d77b-434d-888f-3ab578514bbe" (UID: "4396968c-d77b-434d-888f-3ab578514bbe"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.963711 4740 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-server-conf\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.963747 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.963761 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.963773 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k26n\" (UniqueName: \"kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-kube-api-access-2k26n\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.963783 4740 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.963794 4740 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4396968c-d77b-434d-888f-3ab578514bbe-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:42 crc kubenswrapper[4740]: I0105 14:14:42.997019 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4396968c-d77b-434d-888f-3ab578514bbe" (UID: "4396968c-d77b-434d-888f-3ab578514bbe"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:14:43 crc kubenswrapper[4740]: I0105 14:14:43.066517 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4396968c-d77b-434d-888f-3ab578514bbe-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:43 crc kubenswrapper[4740]: I0105 14:14:43.106833 4740 generic.go:334] "Generic (PLEG): container finished" podID="4396968c-d77b-434d-888f-3ab578514bbe" containerID="9c471ebbc6929cb56e3c7e7bca0c77bb703fd4061fb6e8de0bccfc7f59fb3bcf" exitCode=0 Jan 05 14:14:43 crc kubenswrapper[4740]: I0105 14:14:43.106906 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"4396968c-d77b-434d-888f-3ab578514bbe","Type":"ContainerDied","Data":"9c471ebbc6929cb56e3c7e7bca0c77bb703fd4061fb6e8de0bccfc7f59fb3bcf"} Jan 05 14:14:43 crc kubenswrapper[4740]: I0105 14:14:43.106939 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"4396968c-d77b-434d-888f-3ab578514bbe","Type":"ContainerDied","Data":"393e2fc8f51515016829a7510a15cb7318f43a9f29dcadd9b073868599335d61"} Jan 05 14:14:43 crc kubenswrapper[4740]: I0105 14:14:43.106959 4740 scope.go:117] "RemoveContainer" containerID="9c471ebbc6929cb56e3c7e7bca0c77bb703fd4061fb6e8de0bccfc7f59fb3bcf" Jan 05 14:14:43 crc kubenswrapper[4740]: I0105 14:14:43.107170 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Jan 05 14:14:43 crc kubenswrapper[4740]: I0105 14:14:43.113697 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" event={"ID":"35e15f1f-4511-493c-8e96-8446fb0b7b14","Type":"ContainerDied","Data":"1fd6b9110e6f42035be0500b5ec65a851b92a67a9582219bf47dc42ccb7bcc42"} Jan 05 14:14:43 crc kubenswrapper[4740]: I0105 14:14:43.113991 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fd6b9110e6f42035be0500b5ec65a851b92a67a9582219bf47dc42ccb7bcc42" Jan 05 14:14:43 crc kubenswrapper[4740]: I0105 14:14:43.113755 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xkn8j" Jan 05 14:14:43 crc kubenswrapper[4740]: I0105 14:14:43.884170 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4396968c-d77b-434d-888f-3ab578514bbe-pod-info" (OuterVolumeSpecName: "pod-info") pod "4396968c-d77b-434d-888f-3ab578514bbe" (UID: "4396968c-d77b-434d-888f-3ab578514bbe"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 05 14:14:43 crc kubenswrapper[4740]: I0105 14:14:43.884308 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-config-data" (OuterVolumeSpecName: "config-data") pod "4396968c-d77b-434d-888f-3ab578514bbe" (UID: "4396968c-d77b-434d-888f-3ab578514bbe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:14:43 crc kubenswrapper[4740]: I0105 14:14:43.888586 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4396968c-d77b-434d-888f-3ab578514bbe-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:43 crc kubenswrapper[4740]: I0105 14:14:43.888614 4740 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4396968c-d77b-434d-888f-3ab578514bbe-pod-info\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:43 crc kubenswrapper[4740]: I0105 14:14:43.917778 4740 scope.go:117] "RemoveContainer" containerID="993c05e80bcb4da74365c39514c46be69f849324868d0b66d705303f726b6ac1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.092184 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a" (OuterVolumeSpecName: "persistence") pod "4396968c-d77b-434d-888f-3ab578514bbe" (UID: "4396968c-d77b-434d-888f-3ab578514bbe"). InnerVolumeSpecName "pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.108182 4740 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\") on node \"crc\" " Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.165053 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx"] Jan 05 14:14:44 crc kubenswrapper[4740]: E0105 14:14:44.165803 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4396968c-d77b-434d-888f-3ab578514bbe" containerName="rabbitmq" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.165825 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4396968c-d77b-434d-888f-3ab578514bbe" containerName="rabbitmq" Jan 05 14:14:44 crc kubenswrapper[4740]: E0105 14:14:44.165864 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4396968c-d77b-434d-888f-3ab578514bbe" containerName="setup-container" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.165873 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4396968c-d77b-434d-888f-3ab578514bbe" containerName="setup-container" Jan 05 14:14:44 crc kubenswrapper[4740]: E0105 14:14:44.165897 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e15f1f-4511-493c-8e96-8446fb0b7b14" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.165906 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e15f1f-4511-493c-8e96-8446fb0b7b14" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.166226 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="4396968c-d77b-434d-888f-3ab578514bbe" containerName="rabbitmq" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.166264 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e15f1f-4511-493c-8e96-8446fb0b7b14" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.167369 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.173053 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.179216 4740 scope.go:117] "RemoveContainer" containerID="9c471ebbc6929cb56e3c7e7bca0c77bb703fd4061fb6e8de0bccfc7f59fb3bcf" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.179694 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.179908 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.180054 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:14:44 crc kubenswrapper[4740]: E0105 14:14:44.180364 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c471ebbc6929cb56e3c7e7bca0c77bb703fd4061fb6e8de0bccfc7f59fb3bcf\": container with ID starting with 9c471ebbc6929cb56e3c7e7bca0c77bb703fd4061fb6e8de0bccfc7f59fb3bcf not found: ID does not exist" containerID="9c471ebbc6929cb56e3c7e7bca0c77bb703fd4061fb6e8de0bccfc7f59fb3bcf" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.180475 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c471ebbc6929cb56e3c7e7bca0c77bb703fd4061fb6e8de0bccfc7f59fb3bcf"} err="failed to get container status \"9c471ebbc6929cb56e3c7e7bca0c77bb703fd4061fb6e8de0bccfc7f59fb3bcf\": rpc error: code = NotFound desc = could not find container \"9c471ebbc6929cb56e3c7e7bca0c77bb703fd4061fb6e8de0bccfc7f59fb3bcf\": container with ID starting with 9c471ebbc6929cb56e3c7e7bca0c77bb703fd4061fb6e8de0bccfc7f59fb3bcf not found: ID does not exist" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.180580 4740 scope.go:117] "RemoveContainer" containerID="993c05e80bcb4da74365c39514c46be69f849324868d0b66d705303f726b6ac1" Jan 05 14:14:44 crc kubenswrapper[4740]: E0105 14:14:44.181335 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"993c05e80bcb4da74365c39514c46be69f849324868d0b66d705303f726b6ac1\": container with ID starting with 993c05e80bcb4da74365c39514c46be69f849324868d0b66d705303f726b6ac1 not found: ID does not exist" containerID="993c05e80bcb4da74365c39514c46be69f849324868d0b66d705303f726b6ac1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.181388 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"993c05e80bcb4da74365c39514c46be69f849324868d0b66d705303f726b6ac1"} err="failed to get container status \"993c05e80bcb4da74365c39514c46be69f849324868d0b66d705303f726b6ac1\": rpc error: code = NotFound desc = could not find container \"993c05e80bcb4da74365c39514c46be69f849324868d0b66d705303f726b6ac1\": container with ID starting with 993c05e80bcb4da74365c39514c46be69f849324868d0b66d705303f726b6ac1 not found: ID does not exist" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.203855 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx"] Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.222290 4740 
generic.go:334] "Generic (PLEG): container finished" podID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerID="236541ca6123d7d5583d91aeb48255b7f86cc79c0fd2ca95738893b0d0a7d14f" exitCode=0 Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.222376 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"96b6a192-1efc-47ce-9c5b-26539409d69c","Type":"ContainerDied","Data":"236541ca6123d7d5583d91aeb48255b7f86cc79c0fd2ca95738893b0d0a7d14f"} Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.258149 4740 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.258775 4740 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a") on node "crc" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.314308 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-997gx\" (UID: \"50749d1a-ca02-4a0c-8f55-522f5d53497a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.314375 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvvfl\" (UniqueName: \"kubernetes.io/projected/50749d1a-ca02-4a0c-8f55-522f5d53497a-kube-api-access-nvvfl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-997gx\" (UID: \"50749d1a-ca02-4a0c-8f55-522f5d53497a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.314407 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-997gx\" (UID: \"50749d1a-ca02-4a0c-8f55-522f5d53497a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.314505 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-997gx\" (UID: \"50749d1a-ca02-4a0c-8f55-522f5d53497a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.314571 4740 reconciler_common.go:293] "Volume detached for volume \"pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.362953 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.380683 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-1"] Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.393289 4740 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/rabbitmq-server-1"] Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.395488 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.403900 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.421408 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvvfl\" (UniqueName: \"kubernetes.io/projected/50749d1a-ca02-4a0c-8f55-522f5d53497a-kube-api-access-nvvfl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-997gx\" (UID: \"50749d1a-ca02-4a0c-8f55-522f5d53497a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.421458 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-997gx\" (UID: \"50749d1a-ca02-4a0c-8f55-522f5d53497a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.421580 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-997gx\" (UID: \"50749d1a-ca02-4a0c-8f55-522f5d53497a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.421684 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-997gx\" (UID: \"50749d1a-ca02-4a0c-8f55-522f5d53497a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.427198 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-997gx\" (UID: \"50749d1a-ca02-4a0c-8f55-522f5d53497a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.427866 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-997gx\" (UID: \"50749d1a-ca02-4a0c-8f55-522f5d53497a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.438001 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvvfl\" (UniqueName: \"kubernetes.io/projected/50749d1a-ca02-4a0c-8f55-522f5d53497a-kube-api-access-nvvfl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-997gx\" (UID: \"50749d1a-ca02-4a0c-8f55-522f5d53497a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.445044 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-997gx\" (UID: \"50749d1a-ca02-4a0c-8f55-522f5d53497a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.493493 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.548722 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.548794 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/93d7209e-3012-42db-a76c-cd020634e3c4-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.549043 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6czl\" (UniqueName: \"kubernetes.io/projected/93d7209e-3012-42db-a76c-cd020634e3c4-kube-api-access-q6czl\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.549117 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/93d7209e-3012-42db-a76c-cd020634e3c4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.549161 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/93d7209e-3012-42db-a76c-cd020634e3c4-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.549315 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/93d7209e-3012-42db-a76c-cd020634e3c4-server-conf\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.549337 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/93d7209e-3012-42db-a76c-cd020634e3c4-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.549901 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/93d7209e-3012-42db-a76c-cd020634e3c4-erlang-cookie-secret\") pod 
\"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.550111 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/93d7209e-3012-42db-a76c-cd020634e3c4-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.550223 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/93d7209e-3012-42db-a76c-cd020634e3c4-pod-info\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.550266 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/93d7209e-3012-42db-a76c-cd020634e3c4-config-data\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.651838 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.651877 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/93d7209e-3012-42db-a76c-cd020634e3c4-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.651938 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6czl\" (UniqueName: \"kubernetes.io/projected/93d7209e-3012-42db-a76c-cd020634e3c4-kube-api-access-q6czl\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.651962 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/93d7209e-3012-42db-a76c-cd020634e3c4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.651986 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/93d7209e-3012-42db-a76c-cd020634e3c4-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.652054 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/93d7209e-3012-42db-a76c-cd020634e3c4-server-conf\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc 
kubenswrapper[4740]: I0105 14:14:44.652122 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/93d7209e-3012-42db-a76c-cd020634e3c4-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.652162 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/93d7209e-3012-42db-a76c-cd020634e3c4-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.652221 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/93d7209e-3012-42db-a76c-cd020634e3c4-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.652276 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/93d7209e-3012-42db-a76c-cd020634e3c4-pod-info\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.652293 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/93d7209e-3012-42db-a76c-cd020634e3c4-config-data\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.653625 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/93d7209e-3012-42db-a76c-cd020634e3c4-config-data\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.658999 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/93d7209e-3012-42db-a76c-cd020634e3c4-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.659733 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.659755 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/db0cb9ade84c891cd0bb9eb2fcd30e636936804d545309ca85d72d0417d04c94/globalmount\"" pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.660611 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/93d7209e-3012-42db-a76c-cd020634e3c4-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.661146 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/93d7209e-3012-42db-a76c-cd020634e3c4-server-conf\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.661304 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/93d7209e-3012-42db-a76c-cd020634e3c4-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.661450 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/93d7209e-3012-42db-a76c-cd020634e3c4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.663335 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/93d7209e-3012-42db-a76c-cd020634e3c4-pod-info\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.667285 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/93d7209e-3012-42db-a76c-cd020634e3c4-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.668879 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/93d7209e-3012-42db-a76c-cd020634e3c4-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.702092 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6czl\" (UniqueName: \"kubernetes.io/projected/93d7209e-3012-42db-a76c-cd020634e3c4-kube-api-access-q6czl\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.767436 4740 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.811364 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6cc93836-9eae-4cb7-9a53-dfc2a39d469a\") pod \"rabbitmq-server-1\" (UID: \"93d7209e-3012-42db-a76c-cd020634e3c4\") " pod="openstack/rabbitmq-server-1" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.861552 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-internal-tls-certs\") pod \"96b6a192-1efc-47ce-9c5b-26539409d69c\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.861877 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-combined-ca-bundle\") pod \"96b6a192-1efc-47ce-9c5b-26539409d69c\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.862340 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64445\" (UniqueName: \"kubernetes.io/projected/96b6a192-1efc-47ce-9c5b-26539409d69c-kube-api-access-64445\") pod \"96b6a192-1efc-47ce-9c5b-26539409d69c\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.862682 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-config-data\") pod \"96b6a192-1efc-47ce-9c5b-26539409d69c\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.862841 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-public-tls-certs\") pod \"96b6a192-1efc-47ce-9c5b-26539409d69c\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.862954 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-scripts\") pod \"96b6a192-1efc-47ce-9c5b-26539409d69c\" (UID: \"96b6a192-1efc-47ce-9c5b-26539409d69c\") " Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.866211 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-scripts" (OuterVolumeSpecName: "scripts") pod "96b6a192-1efc-47ce-9c5b-26539409d69c" (UID: "96b6a192-1efc-47ce-9c5b-26539409d69c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.867800 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b6a192-1efc-47ce-9c5b-26539409d69c-kube-api-access-64445" (OuterVolumeSpecName: "kube-api-access-64445") pod "96b6a192-1efc-47ce-9c5b-26539409d69c" (UID: "96b6a192-1efc-47ce-9c5b-26539409d69c"). InnerVolumeSpecName "kube-api-access-64445". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.924242 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "96b6a192-1efc-47ce-9c5b-26539409d69c" (UID: "96b6a192-1efc-47ce-9c5b-26539409d69c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.939363 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "96b6a192-1efc-47ce-9c5b-26539409d69c" (UID: "96b6a192-1efc-47ce-9c5b-26539409d69c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.998473 4740 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.998500 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64445\" (UniqueName: \"kubernetes.io/projected/96b6a192-1efc-47ce-9c5b-26539409d69c-kube-api-access-64445\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.998509 4740 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:44 crc kubenswrapper[4740]: I0105 14:14:44.998522 4740 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-scripts\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.029911 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.035864 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4396968c-d77b-434d-888f-3ab578514bbe" path="/var/lib/kubelet/pods/4396968c-d77b-434d-888f-3ab578514bbe/volumes" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.036878 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-config-data" (OuterVolumeSpecName: "config-data") pod "96b6a192-1efc-47ce-9c5b-26539409d69c" (UID: "96b6a192-1efc-47ce-9c5b-26539409d69c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.073705 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96b6a192-1efc-47ce-9c5b-26539409d69c" (UID: "96b6a192-1efc-47ce-9c5b-26539409d69c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.104549 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.104586 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96b6a192-1efc-47ce-9c5b-26539409d69c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.220809 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx"] Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.262748 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"96b6a192-1efc-47ce-9c5b-26539409d69c","Type":"ContainerDied","Data":"390be8b058510812af0d6eb9aa8fcc8cc906da6e594756eced3a474e31b6fac6"} Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.262859 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.266359 4740 scope.go:117] "RemoveContainer" containerID="236541ca6123d7d5583d91aeb48255b7f86cc79c0fd2ca95738893b0d0a7d14f" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.267867 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" event={"ID":"50749d1a-ca02-4a0c-8f55-522f5d53497a","Type":"ContainerStarted","Data":"5c6082a97b9122df4df7c5fead45ecfc361c09d05dd73b0aa3b5a294472fd9c8"} Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.305335 4740 scope.go:117] "RemoveContainer" containerID="87dda8af178e5c6e53129ed4da8013810d0bbed083a8d069aaca81ddd115d5b6" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.336605 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.347437 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.358765 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Jan 05 14:14:45 crc kubenswrapper[4740]: E0105 14:14:45.359403 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerName="aodh-listener" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.359418 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerName="aodh-listener" Jan 05 14:14:45 crc kubenswrapper[4740]: E0105 14:14:45.359440 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerName="aodh-api" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.359446 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerName="aodh-api" Jan 05 14:14:45 crc kubenswrapper[4740]: E0105 14:14:45.359457 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerName="aodh-notifier" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.359463 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerName="aodh-notifier" Jan 05 14:14:45 crc 
kubenswrapper[4740]: E0105 14:14:45.359481 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerName="aodh-evaluator" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.359488 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerName="aodh-evaluator" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.359728 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerName="aodh-api" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.359744 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerName="aodh-notifier" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.359754 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerName="aodh-listener" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.359770 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b6a192-1efc-47ce-9c5b-26539409d69c" containerName="aodh-evaluator" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.361951 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.363653 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.365684 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-f8s9l" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.365819 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.365935 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.366015 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.372400 4740 scope.go:117] "RemoveContainer" containerID="278c2034c407eb00ceb629ad204f5ba2557b2d84ec7b3deca21971b273afa162" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.373990 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.399671 4740 scope.go:117] "RemoveContainer" containerID="cd8c32c9e8e2b0ea3fd487bd88c97bad3c4291d7986b61f79148c332b9ed1663" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.522488 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Jan 05 14:14:45 crc kubenswrapper[4740]: W0105 14:14:45.525292 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93d7209e_3012_42db_a76c_cd020634e3c4.slice/crio-286d1fb488592cdfe3a0dd1a7e7d8f718e81abf5dd3ed22331840c4eec398588 WatchSource:0}: Error finding container 286d1fb488592cdfe3a0dd1a7e7d8f718e81abf5dd3ed22331840c4eec398588: Status 404 returned error can't find the container with id 286d1fb488592cdfe3a0dd1a7e7d8f718e81abf5dd3ed22331840c4eec398588 Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.529853 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0622aaac-01e2-4f75-8de9-49db879c1703-scripts\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.529960 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0622aaac-01e2-4f75-8de9-49db879c1703-combined-ca-bundle\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.529983 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0622aaac-01e2-4f75-8de9-49db879c1703-internal-tls-certs\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.530483 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0622aaac-01e2-4f75-8de9-49db879c1703-config-data\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.530556 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0622aaac-01e2-4f75-8de9-49db879c1703-public-tls-certs\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.530624 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsrpw\" (UniqueName: \"kubernetes.io/projected/0622aaac-01e2-4f75-8de9-49db879c1703-kube-api-access-xsrpw\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.632155 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0622aaac-01e2-4f75-8de9-49db879c1703-public-tls-certs\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.632229 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsrpw\" (UniqueName: \"kubernetes.io/projected/0622aaac-01e2-4f75-8de9-49db879c1703-kube-api-access-xsrpw\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.632319 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0622aaac-01e2-4f75-8de9-49db879c1703-scripts\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.632392 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0622aaac-01e2-4f75-8de9-49db879c1703-combined-ca-bundle\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.632409 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/0622aaac-01e2-4f75-8de9-49db879c1703-internal-tls-certs\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.632455 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0622aaac-01e2-4f75-8de9-49db879c1703-config-data\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.638115 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0622aaac-01e2-4f75-8de9-49db879c1703-public-tls-certs\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.638257 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0622aaac-01e2-4f75-8de9-49db879c1703-config-data\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.639239 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0622aaac-01e2-4f75-8de9-49db879c1703-internal-tls-certs\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.639952 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0622aaac-01e2-4f75-8de9-49db879c1703-combined-ca-bundle\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.641800 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0622aaac-01e2-4f75-8de9-49db879c1703-scripts\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.654350 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsrpw\" (UniqueName: \"kubernetes.io/projected/0622aaac-01e2-4f75-8de9-49db879c1703-kube-api-access-xsrpw\") pod \"aodh-0\" (UID: \"0622aaac-01e2-4f75-8de9-49db879c1703\") " pod="openstack/aodh-0" Jan 05 14:14:45 crc kubenswrapper[4740]: I0105 14:14:45.678935 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Jan 05 14:14:46 crc kubenswrapper[4740]: I0105 14:14:46.195835 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Jan 05 14:14:46 crc kubenswrapper[4740]: I0105 14:14:46.277026 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" event={"ID":"50749d1a-ca02-4a0c-8f55-522f5d53497a","Type":"ContainerStarted","Data":"e2284858d3f5dc879c6262a346e85b3537450b1a3799d9132bf0fd1418038c90"} Jan 05 14:14:46 crc kubenswrapper[4740]: I0105 14:14:46.278897 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0622aaac-01e2-4f75-8de9-49db879c1703","Type":"ContainerStarted","Data":"809cb229a896ee08a9049affe37624f9520648cf369094eb0c79b4024cd31f0b"} Jan 05 14:14:46 crc kubenswrapper[4740]: I0105 14:14:46.281714 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"93d7209e-3012-42db-a76c-cd020634e3c4","Type":"ContainerStarted","Data":"286d1fb488592cdfe3a0dd1a7e7d8f718e81abf5dd3ed22331840c4eec398588"} Jan 05 14:14:46 crc kubenswrapper[4740]: I0105 14:14:46.294840 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" podStartSLOduration=1.609158001 podStartE2EDuration="2.294823302s" podCreationTimestamp="2026-01-05 14:14:44 +0000 UTC" firstStartedPulling="2026-01-05 14:14:45.129194692 +0000 UTC m=+1534.436103271" lastFinishedPulling="2026-01-05 14:14:45.814859993 +0000 UTC m=+1535.121768572" observedRunningTime="2026-01-05 14:14:46.292408217 +0000 UTC m=+1535.599316806" watchObservedRunningTime="2026-01-05 14:14:46.294823302 +0000 UTC m=+1535.601731891" Jan 05 14:14:46 crc kubenswrapper[4740]: I0105 14:14:46.988216 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b6a192-1efc-47ce-9c5b-26539409d69c" path="/var/lib/kubelet/pods/96b6a192-1efc-47ce-9c5b-26539409d69c/volumes" Jan 05 14:14:47 crc kubenswrapper[4740]: I0105 14:14:47.298913 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0622aaac-01e2-4f75-8de9-49db879c1703","Type":"ContainerStarted","Data":"4001d93fcf8aefad85067c94f319e554b39c9dc525941866c62b87747af329a1"} Jan 05 14:14:47 crc kubenswrapper[4740]: I0105 14:14:47.303505 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"93d7209e-3012-42db-a76c-cd020634e3c4","Type":"ContainerStarted","Data":"db9ab985541c39fab467824ba86b22ddb8d014a5b8074890ad7a9262b833e137"} Jan 05 14:14:49 crc kubenswrapper[4740]: I0105 14:14:49.333413 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0622aaac-01e2-4f75-8de9-49db879c1703","Type":"ContainerStarted","Data":"3954b5fd37955a473e4b78c89964f60367450a9f0e5d0fcb94df816e8f070fef"} Jan 05 14:14:50 crc kubenswrapper[4740]: I0105 14:14:50.354861 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0622aaac-01e2-4f75-8de9-49db879c1703","Type":"ContainerStarted","Data":"bec912a0958e7b55bd96c8b31ca6efb6b93a997e2ea32ceace0c7720e49a8169"} Jan 05 14:14:51 crc kubenswrapper[4740]: I0105 14:14:51.368241 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0622aaac-01e2-4f75-8de9-49db879c1703","Type":"ContainerStarted","Data":"c7e825375b7dc32d9ffccef99fd64feee3e735454dc444ef84367d2777787e4c"} Jan 05 14:14:51 crc kubenswrapper[4740]: I0105 
14:14:51.396367 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.8115864400000001 podStartE2EDuration="6.396332907s" podCreationTimestamp="2026-01-05 14:14:45 +0000 UTC" firstStartedPulling="2026-01-05 14:14:46.20442465 +0000 UTC m=+1535.511333229" lastFinishedPulling="2026-01-05 14:14:50.789171117 +0000 UTC m=+1540.096079696" observedRunningTime="2026-01-05 14:14:51.386508002 +0000 UTC m=+1540.693416581" watchObservedRunningTime="2026-01-05 14:14:51.396332907 +0000 UTC m=+1540.703241486" Jan 05 14:15:00 crc kubenswrapper[4740]: I0105 14:15:00.145525 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p"] Jan 05 14:15:00 crc kubenswrapper[4740]: I0105 14:15:00.147854 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" Jan 05 14:15:00 crc kubenswrapper[4740]: I0105 14:15:00.151765 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 05 14:15:00 crc kubenswrapper[4740]: I0105 14:15:00.156734 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 05 14:15:00 crc kubenswrapper[4740]: I0105 14:15:00.160456 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p"] Jan 05 14:15:00 crc kubenswrapper[4740]: I0105 14:15:00.246810 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0120f59f-644d-4257-ab41-20f87b94c07e-secret-volume\") pod \"collect-profiles-29460375-xwp9p\" (UID: \"0120f59f-644d-4257-ab41-20f87b94c07e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" Jan 05 14:15:00 crc kubenswrapper[4740]: I0105 14:15:00.246868 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0120f59f-644d-4257-ab41-20f87b94c07e-config-volume\") pod \"collect-profiles-29460375-xwp9p\" (UID: \"0120f59f-644d-4257-ab41-20f87b94c07e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" Jan 05 14:15:00 crc kubenswrapper[4740]: I0105 14:15:00.246927 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg9fb\" (UniqueName: \"kubernetes.io/projected/0120f59f-644d-4257-ab41-20f87b94c07e-kube-api-access-xg9fb\") pod \"collect-profiles-29460375-xwp9p\" (UID: \"0120f59f-644d-4257-ab41-20f87b94c07e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" Jan 05 14:15:00 crc kubenswrapper[4740]: I0105 14:15:00.349179 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0120f59f-644d-4257-ab41-20f87b94c07e-secret-volume\") pod \"collect-profiles-29460375-xwp9p\" (UID: \"0120f59f-644d-4257-ab41-20f87b94c07e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" Jan 05 14:15:00 crc kubenswrapper[4740]: I0105 14:15:00.349509 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/0120f59f-644d-4257-ab41-20f87b94c07e-config-volume\") pod \"collect-profiles-29460375-xwp9p\" (UID: \"0120f59f-644d-4257-ab41-20f87b94c07e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" Jan 05 14:15:00 crc kubenswrapper[4740]: I0105 14:15:00.350430 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0120f59f-644d-4257-ab41-20f87b94c07e-config-volume\") pod \"collect-profiles-29460375-xwp9p\" (UID: \"0120f59f-644d-4257-ab41-20f87b94c07e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" Jan 05 14:15:00 crc kubenswrapper[4740]: I0105 14:15:00.350624 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg9fb\" (UniqueName: \"kubernetes.io/projected/0120f59f-644d-4257-ab41-20f87b94c07e-kube-api-access-xg9fb\") pod \"collect-profiles-29460375-xwp9p\" (UID: \"0120f59f-644d-4257-ab41-20f87b94c07e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" Jan 05 14:15:00 crc kubenswrapper[4740]: I0105 14:15:00.356431 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0120f59f-644d-4257-ab41-20f87b94c07e-secret-volume\") pod \"collect-profiles-29460375-xwp9p\" (UID: \"0120f59f-644d-4257-ab41-20f87b94c07e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" Jan 05 14:15:00 crc kubenswrapper[4740]: I0105 14:15:00.388379 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg9fb\" (UniqueName: \"kubernetes.io/projected/0120f59f-644d-4257-ab41-20f87b94c07e-kube-api-access-xg9fb\") pod \"collect-profiles-29460375-xwp9p\" (UID: \"0120f59f-644d-4257-ab41-20f87b94c07e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" Jan 05 14:15:00 crc kubenswrapper[4740]: I0105 14:15:00.476213 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" Jan 05 14:15:01 crc kubenswrapper[4740]: I0105 14:15:01.124622 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p"] Jan 05 14:15:01 crc kubenswrapper[4740]: W0105 14:15:01.128232 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0120f59f_644d_4257_ab41_20f87b94c07e.slice/crio-7d4f5afad4c680a6d92eb59fff2319b463626906af4dfc6d3b6416e2221c6afb WatchSource:0}: Error finding container 7d4f5afad4c680a6d92eb59fff2319b463626906af4dfc6d3b6416e2221c6afb: Status 404 returned error can't find the container with id 7d4f5afad4c680a6d92eb59fff2319b463626906af4dfc6d3b6416e2221c6afb Jan 05 14:15:01 crc kubenswrapper[4740]: I0105 14:15:01.497711 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" event={"ID":"0120f59f-644d-4257-ab41-20f87b94c07e","Type":"ContainerStarted","Data":"9f390851fdde689d96b612fdd0d1cd24e7e9197f0bb49313a0bc71c2f3406d5e"} Jan 05 14:15:01 crc kubenswrapper[4740]: I0105 14:15:01.498032 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" event={"ID":"0120f59f-644d-4257-ab41-20f87b94c07e","Type":"ContainerStarted","Data":"7d4f5afad4c680a6d92eb59fff2319b463626906af4dfc6d3b6416e2221c6afb"} Jan 05 14:15:01 crc kubenswrapper[4740]: I0105 14:15:01.528845 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" podStartSLOduration=1.52882358 podStartE2EDuration="1.52882358s" podCreationTimestamp="2026-01-05 14:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:15:01.512345387 +0000 UTC m=+1550.819253976" watchObservedRunningTime="2026-01-05 14:15:01.52882358 +0000 UTC m=+1550.835732159" Jan 05 14:15:02 crc kubenswrapper[4740]: I0105 14:15:02.523272 4740 generic.go:334] "Generic (PLEG): container finished" podID="0120f59f-644d-4257-ab41-20f87b94c07e" containerID="9f390851fdde689d96b612fdd0d1cd24e7e9197f0bb49313a0bc71c2f3406d5e" exitCode=0 Jan 05 14:15:02 crc kubenswrapper[4740]: I0105 14:15:02.523377 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" event={"ID":"0120f59f-644d-4257-ab41-20f87b94c07e","Type":"ContainerDied","Data":"9f390851fdde689d96b612fdd0d1cd24e7e9197f0bb49313a0bc71c2f3406d5e"} Jan 05 14:15:04 crc kubenswrapper[4740]: I0105 14:15:04.028718 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" Jan 05 14:15:04 crc kubenswrapper[4740]: I0105 14:15:04.157116 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0120f59f-644d-4257-ab41-20f87b94c07e-config-volume\") pod \"0120f59f-644d-4257-ab41-20f87b94c07e\" (UID: \"0120f59f-644d-4257-ab41-20f87b94c07e\") " Jan 05 14:15:04 crc kubenswrapper[4740]: I0105 14:15:04.157376 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0120f59f-644d-4257-ab41-20f87b94c07e-secret-volume\") pod \"0120f59f-644d-4257-ab41-20f87b94c07e\" (UID: \"0120f59f-644d-4257-ab41-20f87b94c07e\") " Jan 05 14:15:04 crc kubenswrapper[4740]: I0105 14:15:04.157504 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg9fb\" (UniqueName: \"kubernetes.io/projected/0120f59f-644d-4257-ab41-20f87b94c07e-kube-api-access-xg9fb\") pod \"0120f59f-644d-4257-ab41-20f87b94c07e\" (UID: \"0120f59f-644d-4257-ab41-20f87b94c07e\") " Jan 05 14:15:04 crc kubenswrapper[4740]: I0105 14:15:04.158258 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0120f59f-644d-4257-ab41-20f87b94c07e-config-volume" (OuterVolumeSpecName: "config-volume") pod "0120f59f-644d-4257-ab41-20f87b94c07e" (UID: "0120f59f-644d-4257-ab41-20f87b94c07e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:15:04 crc kubenswrapper[4740]: I0105 14:15:04.163196 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0120f59f-644d-4257-ab41-20f87b94c07e-kube-api-access-xg9fb" (OuterVolumeSpecName: "kube-api-access-xg9fb") pod "0120f59f-644d-4257-ab41-20f87b94c07e" (UID: "0120f59f-644d-4257-ab41-20f87b94c07e"). InnerVolumeSpecName "kube-api-access-xg9fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:15:04 crc kubenswrapper[4740]: I0105 14:15:04.164279 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0120f59f-644d-4257-ab41-20f87b94c07e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0120f59f-644d-4257-ab41-20f87b94c07e" (UID: "0120f59f-644d-4257-ab41-20f87b94c07e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:15:04 crc kubenswrapper[4740]: I0105 14:15:04.262488 4740 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0120f59f-644d-4257-ab41-20f87b94c07e-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 05 14:15:04 crc kubenswrapper[4740]: I0105 14:15:04.262544 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg9fb\" (UniqueName: \"kubernetes.io/projected/0120f59f-644d-4257-ab41-20f87b94c07e-kube-api-access-xg9fb\") on node \"crc\" DevicePath \"\"" Jan 05 14:15:04 crc kubenswrapper[4740]: I0105 14:15:04.262567 4740 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0120f59f-644d-4257-ab41-20f87b94c07e-config-volume\") on node \"crc\" DevicePath \"\"" Jan 05 14:15:04 crc kubenswrapper[4740]: I0105 14:15:04.561749 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" event={"ID":"0120f59f-644d-4257-ab41-20f87b94c07e","Type":"ContainerDied","Data":"7d4f5afad4c680a6d92eb59fff2319b463626906af4dfc6d3b6416e2221c6afb"} Jan 05 14:15:04 crc kubenswrapper[4740]: I0105 14:15:04.561810 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d4f5afad4c680a6d92eb59fff2319b463626906af4dfc6d3b6416e2221c6afb" Jan 05 14:15:04 crc kubenswrapper[4740]: I0105 14:15:04.561909 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p" Jan 05 14:15:19 crc kubenswrapper[4740]: I0105 14:15:19.780520 4740 generic.go:334] "Generic (PLEG): container finished" podID="93d7209e-3012-42db-a76c-cd020634e3c4" containerID="db9ab985541c39fab467824ba86b22ddb8d014a5b8074890ad7a9262b833e137" exitCode=0 Jan 05 14:15:19 crc kubenswrapper[4740]: I0105 14:15:19.780646 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"93d7209e-3012-42db-a76c-cd020634e3c4","Type":"ContainerDied","Data":"db9ab985541c39fab467824ba86b22ddb8d014a5b8074890ad7a9262b833e137"} Jan 05 14:15:20 crc kubenswrapper[4740]: I0105 14:15:20.799354 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"93d7209e-3012-42db-a76c-cd020634e3c4","Type":"ContainerStarted","Data":"667a08ca592452c2576687dcf074b662e1508af095aeb4c63e314a5de3e65db8"} Jan 05 14:15:20 crc kubenswrapper[4740]: I0105 14:15:20.799920 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Jan 05 14:15:20 crc kubenswrapper[4740]: I0105 14:15:20.826216 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=36.82619756 podStartE2EDuration="36.82619756s" podCreationTimestamp="2026-01-05 14:14:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:15:20.818953276 +0000 UTC m=+1570.125861875" watchObservedRunningTime="2026-01-05 14:15:20.82619756 +0000 UTC m=+1570.133106139" Jan 05 14:15:28 crc kubenswrapper[4740]: I0105 14:15:28.499189 4740 scope.go:117] "RemoveContainer" containerID="31abe96eff881f2e28f34c9a99c430103b86b759958fd027c0a3fb182a9ad099" Jan 05 14:15:28 crc kubenswrapper[4740]: I0105 14:15:28.524861 4740 scope.go:117] "RemoveContainer" 
containerID="a64df19c895975cf0990e36d21938e7dea57cd874d75f3d8562b5365c131a1fc" Jan 05 14:15:28 crc kubenswrapper[4740]: I0105 14:15:28.561942 4740 scope.go:117] "RemoveContainer" containerID="550c71fbe0bc023fb2dd62de14263da5cec9c1da6bf1688f949207f404bec575" Jan 05 14:15:28 crc kubenswrapper[4740]: I0105 14:15:28.598001 4740 scope.go:117] "RemoveContainer" containerID="8e0d5775dc01bfc889f50d3f81e383edcfed499c68ec42c22984729c7a5e0863" Jan 05 14:15:28 crc kubenswrapper[4740]: I0105 14:15:28.688493 4740 scope.go:117] "RemoveContainer" containerID="c0da9ed00a6d44e62107d51f61fe908b308e041d3d119c398aa18734f6abff84" Jan 05 14:15:28 crc kubenswrapper[4740]: I0105 14:15:28.727742 4740 scope.go:117] "RemoveContainer" containerID="a7dea7bce463deeb0849a33a3a74c8beb5e29f29e178831abb5786dd73f8fca7" Jan 05 14:15:28 crc kubenswrapper[4740]: I0105 14:15:28.765109 4740 scope.go:117] "RemoveContainer" containerID="e59bce26fca4df9150498bb6e10543a8675137393eff56d64d9ec6771bc0ceb4" Jan 05 14:15:35 crc kubenswrapper[4740]: I0105 14:15:35.033257 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Jan 05 14:15:35 crc kubenswrapper[4740]: I0105 14:15:35.094004 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 05 14:15:39 crc kubenswrapper[4740]: I0105 14:15:39.460030 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" containerName="rabbitmq" containerID="cri-o://de7e0d44b9d033eede30158cf5b750ac579ec6dc07c237d4c01d036ff58d10fa" gracePeriod=604796 Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.162945 4740 generic.go:334] "Generic (PLEG): container finished" podID="3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" containerID="de7e0d44b9d033eede30158cf5b750ac579ec6dc07c237d4c01d036ff58d10fa" exitCode=0 Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.163033 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299","Type":"ContainerDied","Data":"de7e0d44b9d033eede30158cf5b750ac579ec6dc07c237d4c01d036ff58d10fa"} Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.163410 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299","Type":"ContainerDied","Data":"253c93cf4975e48e292f3708d36f0d8c41fe5dfc3d05db513a6a0174493b358e"} Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.163424 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="253c93cf4975e48e292f3708d36f0d8c41fe5dfc3d05db513a6a0174493b358e" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.259895 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.342545 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-plugins-conf\") pod \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.342636 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-confd\") pod \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.342670 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qchns\" (UniqueName: \"kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-kube-api-access-qchns\") pod \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.343281 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" (UID: "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.344095 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\") pod \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.344133 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-erlang-cookie-secret\") pod \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.344186 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-erlang-cookie\") pod \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.344236 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-server-conf\") pod \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.344265 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-plugins\") pod \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.344359 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-pod-info\") pod \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.344399 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-tls\") pod \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.344468 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-config-data\") pod \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\" (UID: \"3bc8ef80-7c3c-4e61-8eaf-294ab4a75299\") " Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.345167 4740 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.346736 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" (UID: "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.348811 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-kube-api-access-qchns" (OuterVolumeSpecName: "kube-api-access-qchns") pod "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" (UID: "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299"). InnerVolumeSpecName "kube-api-access-qchns". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.349027 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" (UID: "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.350293 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-pod-info" (OuterVolumeSpecName: "pod-info") pod "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" (UID: "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.356515 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" (UID: "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.374313 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" (UID: "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.383721 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df" (OuterVolumeSpecName: "persistence") pod "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" (UID: "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299"). InnerVolumeSpecName "pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.402126 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-config-data" (OuterVolumeSpecName: "config-data") pod "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" (UID: "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.453774 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-server-conf" (OuterVolumeSpecName: "server-conf") pod "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" (UID: "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.454669 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.454705 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qchns\" (UniqueName: \"kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-kube-api-access-qchns\") on node \"crc\" DevicePath \"\"" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.454744 4740 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\") on node \"crc\" " Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.454759 4740 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.454773 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.454787 4740 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-server-conf\") on node \"crc\" DevicePath \"\"" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.454799 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.454810 4740 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-pod-info\") on node \"crc\" DevicePath \"\"" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.454821 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.491601 4740 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.491752 4740 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df") on node "crc" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.530278 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" (UID: "3bc8ef80-7c3c-4e61-8eaf-294ab4a75299"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.557156 4740 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 05 14:15:46 crc kubenswrapper[4740]: I0105 14:15:46.557185 4740 reconciler_common.go:293] "Volume detached for volume \"pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\") on node \"crc\" DevicePath \"\"" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.174724 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.199424 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.209913 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.239393 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 05 14:15:47 crc kubenswrapper[4740]: E0105 14:15:47.240139 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0120f59f-644d-4257-ab41-20f87b94c07e" containerName="collect-profiles" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.240164 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0120f59f-644d-4257-ab41-20f87b94c07e" containerName="collect-profiles" Jan 05 14:15:47 crc kubenswrapper[4740]: E0105 14:15:47.240183 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" containerName="rabbitmq" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.240193 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" containerName="rabbitmq" Jan 05 14:15:47 crc kubenswrapper[4740]: E0105 14:15:47.240225 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" containerName="setup-container" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.240233 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" containerName="setup-container" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.240511 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0120f59f-644d-4257-ab41-20f87b94c07e" containerName="collect-profiles" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.240561 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" containerName="rabbitmq" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.242218 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.250717 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.377182 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.377251 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.377994 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.378033 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.378206 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.378270 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.378343 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-config-data\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.378457 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.378480 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lddhg\" (UniqueName: 
\"kubernetes.io/projected/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-kube-api-access-lddhg\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.378609 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.378644 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.480333 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.480386 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-config-data\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.480470 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.480492 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lddhg\" (UniqueName: \"kubernetes.io/projected/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-kube-api-access-lddhg\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.480550 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.480582 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.481496 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.480618 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.481581 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.482436 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.482676 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.482754 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.483184 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.483215 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.483391 4740 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.483418 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.483430 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2969f94af2ae2bb1af3fa7e4547ad87c9fa0b3a503c46d52774766932196392c/globalmount\"" pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.484709 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.485108 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.492293 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.500977 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.501341 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.509775 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lddhg\" (UniqueName: \"kubernetes.io/projected/f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e-kube-api-access-lddhg\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: I0105 14:15:47.569620 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f16d930-1747-479f-9fd5-3e5683b1a6df\") pod \"rabbitmq-server-0\" (UID: \"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e\") " pod="openstack/rabbitmq-server-0" Jan 05 14:15:47 crc kubenswrapper[4740]: 
I0105 14:15:47.860843 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 05 14:15:48 crc kubenswrapper[4740]: I0105 14:15:48.365030 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 05 14:15:48 crc kubenswrapper[4740]: I0105 14:15:48.988202 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bc8ef80-7c3c-4e61-8eaf-294ab4a75299" path="/var/lib/kubelet/pods/3bc8ef80-7c3c-4e61-8eaf-294ab4a75299/volumes" Jan 05 14:15:49 crc kubenswrapper[4740]: I0105 14:15:49.209588 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e","Type":"ContainerStarted","Data":"1591f14cce65a57511348bad2c9a8aa17572b91a6da1beb4b0256f5d258bab34"} Jan 05 14:15:51 crc kubenswrapper[4740]: I0105 14:15:51.237632 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e","Type":"ContainerStarted","Data":"5597548d9a6ed006854e3b4d3e4c0d533f09cf8920623386981513a81af4f680"} Jan 05 14:16:23 crc kubenswrapper[4740]: I0105 14:16:23.716385 4740 generic.go:334] "Generic (PLEG): container finished" podID="f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e" containerID="5597548d9a6ed006854e3b4d3e4c0d533f09cf8920623386981513a81af4f680" exitCode=0 Jan 05 14:16:23 crc kubenswrapper[4740]: I0105 14:16:23.717284 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e","Type":"ContainerDied","Data":"5597548d9a6ed006854e3b4d3e4c0d533f09cf8920623386981513a81af4f680"} Jan 05 14:16:24 crc kubenswrapper[4740]: I0105 14:16:24.729797 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e","Type":"ContainerStarted","Data":"90f5b0d91d1cfb12371ead922280ec7a778083d9fa0164b167d66c4e7e394278"} Jan 05 14:16:24 crc kubenswrapper[4740]: I0105 14:16:24.730191 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 05 14:16:28 crc kubenswrapper[4740]: I0105 14:16:28.971446 4740 scope.go:117] "RemoveContainer" containerID="e9cc1482edcbcb82406ec9648c0304d77917f947a6470b9aa5aadad1df652348" Jan 05 14:16:29 crc kubenswrapper[4740]: I0105 14:16:29.002775 4740 scope.go:117] "RemoveContainer" containerID="f9df0a1efe8e1fd5aa6bbc40c927e8d8827eea01e60c3d66bf2daaa80251bcdd" Jan 05 14:16:29 crc kubenswrapper[4740]: I0105 14:16:29.025847 4740 scope.go:117] "RemoveContainer" containerID="ec151a9ec49a8c4a5f947daa6a8a86ed246e271c624f6ccbece01f1324acdbef" Jan 05 14:16:29 crc kubenswrapper[4740]: I0105 14:16:29.053714 4740 scope.go:117] "RemoveContainer" containerID="756a0e2ca0c0323cc468c214fb467f66104f4b704d94c150379ff8f85da1a3bc" Jan 05 14:16:29 crc kubenswrapper[4740]: I0105 14:16:29.135233 4740 scope.go:117] "RemoveContainer" containerID="75ed23573953c0c6bdd2f2884fa5a323a3699a6921bd1ca83e8cf4e76851b2c1" Jan 05 14:16:29 crc kubenswrapper[4740]: I0105 14:16:29.183671 4740 scope.go:117] "RemoveContainer" containerID="b8fb83913b2a96a66327269f7aea014b20db1841f90afc12ae327e8d5ac758d3" Jan 05 14:16:29 crc kubenswrapper[4740]: I0105 14:16:29.216421 4740 scope.go:117] "RemoveContainer" containerID="37113591b12f42e7504421b7b942eb220d3cb9518e7bc8bd8578e9b1b53a28ac" Jan 05 14:16:29 crc kubenswrapper[4740]: I0105 14:16:29.238782 4740 scope.go:117] "RemoveContainer" 
containerID="8c90290b7aa487c361dec58ddaf0c839a114359312b9d1c83ec748212c1f3b0b" Jan 05 14:16:29 crc kubenswrapper[4740]: I0105 14:16:29.265042 4740 scope.go:117] "RemoveContainer" containerID="de7e0d44b9d033eede30158cf5b750ac579ec6dc07c237d4c01d036ff58d10fa" Jan 05 14:16:29 crc kubenswrapper[4740]: I0105 14:16:29.323539 4740 scope.go:117] "RemoveContainer" containerID="43b6b7c4584a0ece7548024dea78025680a0020db633f2707fc2e8fa25ba7e31" Jan 05 14:16:31 crc kubenswrapper[4740]: I0105 14:16:31.915578 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:16:31 crc kubenswrapper[4740]: I0105 14:16:31.916148 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:16:37 crc kubenswrapper[4740]: I0105 14:16:37.864264 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 05 14:16:37 crc kubenswrapper[4740]: I0105 14:16:37.898586 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.898566064 podStartE2EDuration="50.898566064s" podCreationTimestamp="2026-01-05 14:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:16:24.755172885 +0000 UTC m=+1634.062081474" watchObservedRunningTime="2026-01-05 14:16:37.898566064 +0000 UTC m=+1647.205474653" Jan 05 14:17:01 crc kubenswrapper[4740]: I0105 14:17:01.916400 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:17:01 crc kubenswrapper[4740]: I0105 14:17:01.916949 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:17:29 crc kubenswrapper[4740]: I0105 14:17:29.540835 4740 scope.go:117] "RemoveContainer" containerID="b81f88f084252f3f4c848a191096c8570b19f56426c0c005faa9322e001569fc" Jan 05 14:17:29 crc kubenswrapper[4740]: I0105 14:17:29.572585 4740 scope.go:117] "RemoveContainer" containerID="b26e21512e236401b079a1d8112670b3253bcf4216720cc81e8ff27ebe6cb1de" Jan 05 14:17:31 crc kubenswrapper[4740]: I0105 14:17:31.916396 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:17:31 crc kubenswrapper[4740]: I0105 14:17:31.916979 4740 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:17:31 crc kubenswrapper[4740]: I0105 14:17:31.917043 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 14:17:31 crc kubenswrapper[4740]: I0105 14:17:31.918013 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 14:17:31 crc kubenswrapper[4740]: I0105 14:17:31.918143 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" gracePeriod=600 Jan 05 14:17:32 crc kubenswrapper[4740]: E0105 14:17:32.046719 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:17:32 crc kubenswrapper[4740]: I0105 14:17:32.659351 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" exitCode=0 Jan 05 14:17:32 crc kubenswrapper[4740]: I0105 14:17:32.659419 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5"} Jan 05 14:17:32 crc kubenswrapper[4740]: I0105 14:17:32.659487 4740 scope.go:117] "RemoveContainer" containerID="bcada73fec747c8bb22d39df02fd140c2ecd4b0b0dc04e0085ad7f13dab4ab07" Jan 05 14:17:32 crc kubenswrapper[4740]: I0105 14:17:32.660601 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:17:32 crc kubenswrapper[4740]: E0105 14:17:32.661519 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:17:45 crc kubenswrapper[4740]: I0105 14:17:45.968500 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:17:45 crc kubenswrapper[4740]: E0105 14:17:45.970916 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.085834 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-fbfrn"] Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.106648 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-wmlbx"] Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.120123 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-fbfrn"] Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.130673 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-rm8zj"] Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.148774 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8281-account-create-update-fmlwx"] Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.163856 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-v4lk2"] Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.176209 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-rm8zj"] Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.189213 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-wmlbx"] Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.200615 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-98ff-account-create-update-48ls5"] Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.211992 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-v4lk2"] Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.223013 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-8281-account-create-update-fmlwx"] Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.234056 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-98ff-account-create-update-48ls5"] Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.991042 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="220854d5-c397-408e-b4f1-4f5c7a9ab8b2" path="/var/lib/kubelet/pods/220854d5-c397-408e-b4f1-4f5c7a9ab8b2/volumes" Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.992344 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="236225ce-ccde-40de-a45f-c19e52ece918" path="/var/lib/kubelet/pods/236225ce-ccde-40de-a45f-c19e52ece918/volumes" Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.992952 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25773a1d-61fa-499b-a0b5-43dfdaafa421" path="/var/lib/kubelet/pods/25773a1d-61fa-499b-a0b5-43dfdaafa421/volumes" Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.993584 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77686b86-ebf6-47a1-9018-a32f1a089def" path="/var/lib/kubelet/pods/77686b86-ebf6-47a1-9018-a32f1a089def/volumes" Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.994806 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889" 
path="/var/lib/kubelet/pods/c2a5bb1e-c5e1-4dc8-b56d-e6fb2c615889/volumes" Jan 05 14:17:46 crc kubenswrapper[4740]: I0105 14:17:46.995517 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00" path="/var/lib/kubelet/pods/e8a07a26-5a7e-49b8-8445-bf1ce2bdbe00/volumes" Jan 05 14:17:47 crc kubenswrapper[4740]: I0105 14:17:47.061002 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5361-account-create-update-m2b2w"] Jan 05 14:17:47 crc kubenswrapper[4740]: I0105 14:17:47.082740 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-506a-account-create-update-w45h2"] Jan 05 14:17:47 crc kubenswrapper[4740]: I0105 14:17:47.098872 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5361-account-create-update-m2b2w"] Jan 05 14:17:47 crc kubenswrapper[4740]: I0105 14:17:47.113839 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-506a-account-create-update-w45h2"] Jan 05 14:17:48 crc kubenswrapper[4740]: I0105 14:17:48.984615 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63b41b33-b5bf-418b-b5c5-595e952f1380" path="/var/lib/kubelet/pods/63b41b33-b5bf-418b-b5c5-595e952f1380/volumes" Jan 05 14:17:48 crc kubenswrapper[4740]: I0105 14:17:48.986880 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2589e83-7430-45b3-95ad-a91a164948a6" path="/var/lib/kubelet/pods/c2589e83-7430-45b3-95ad-a91a164948a6/volumes" Jan 05 14:17:56 crc kubenswrapper[4740]: I0105 14:17:56.038495 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd"] Jan 05 14:17:56 crc kubenswrapper[4740]: I0105 14:17:56.050904 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-d171-account-create-update-4ckgq"] Jan 05 14:17:56 crc kubenswrapper[4740]: I0105 14:17:56.061290 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-d171-account-create-update-4ckgq"] Jan 05 14:17:56 crc kubenswrapper[4740]: I0105 14:17:56.071082 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-sjtsd"] Jan 05 14:17:56 crc kubenswrapper[4740]: I0105 14:17:56.982044 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd13c104-1d69-4bc0-86f5-c9aad7c9010a" path="/var/lib/kubelet/pods/bd13c104-1d69-4bc0-86f5-c9aad7c9010a/volumes" Jan 05 14:17:56 crc kubenswrapper[4740]: I0105 14:17:56.983004 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb571d2d-b11f-47e3-855a-6b684dd3937e" path="/var/lib/kubelet/pods/cb571d2d-b11f-47e3-855a-6b684dd3937e/volumes" Jan 05 14:17:58 crc kubenswrapper[4740]: I0105 14:17:58.969794 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:17:58 crc kubenswrapper[4740]: E0105 14:17:58.971403 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:18:02 crc kubenswrapper[4740]: I0105 14:18:02.065378 4740 generic.go:334] "Generic (PLEG): 
container finished" podID="50749d1a-ca02-4a0c-8f55-522f5d53497a" containerID="e2284858d3f5dc879c6262a346e85b3537450b1a3799d9132bf0fd1418038c90" exitCode=0 Jan 05 14:18:02 crc kubenswrapper[4740]: I0105 14:18:02.065468 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" event={"ID":"50749d1a-ca02-4a0c-8f55-522f5d53497a","Type":"ContainerDied","Data":"e2284858d3f5dc879c6262a346e85b3537450b1a3799d9132bf0fd1418038c90"} Jan 05 14:18:03 crc kubenswrapper[4740]: I0105 14:18:03.738562 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" Jan 05 14:18:03 crc kubenswrapper[4740]: I0105 14:18:03.897743 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvvfl\" (UniqueName: \"kubernetes.io/projected/50749d1a-ca02-4a0c-8f55-522f5d53497a-kube-api-access-nvvfl\") pod \"50749d1a-ca02-4a0c-8f55-522f5d53497a\" (UID: \"50749d1a-ca02-4a0c-8f55-522f5d53497a\") " Jan 05 14:18:03 crc kubenswrapper[4740]: I0105 14:18:03.898323 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-ssh-key\") pod \"50749d1a-ca02-4a0c-8f55-522f5d53497a\" (UID: \"50749d1a-ca02-4a0c-8f55-522f5d53497a\") " Jan 05 14:18:03 crc kubenswrapper[4740]: I0105 14:18:03.898467 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-bootstrap-combined-ca-bundle\") pod \"50749d1a-ca02-4a0c-8f55-522f5d53497a\" (UID: \"50749d1a-ca02-4a0c-8f55-522f5d53497a\") " Jan 05 14:18:03 crc kubenswrapper[4740]: I0105 14:18:03.898605 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-inventory\") pod \"50749d1a-ca02-4a0c-8f55-522f5d53497a\" (UID: \"50749d1a-ca02-4a0c-8f55-522f5d53497a\") " Jan 05 14:18:03 crc kubenswrapper[4740]: I0105 14:18:03.905475 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50749d1a-ca02-4a0c-8f55-522f5d53497a-kube-api-access-nvvfl" (OuterVolumeSpecName: "kube-api-access-nvvfl") pod "50749d1a-ca02-4a0c-8f55-522f5d53497a" (UID: "50749d1a-ca02-4a0c-8f55-522f5d53497a"). InnerVolumeSpecName "kube-api-access-nvvfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:18:03 crc kubenswrapper[4740]: I0105 14:18:03.911445 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "50749d1a-ca02-4a0c-8f55-522f5d53497a" (UID: "50749d1a-ca02-4a0c-8f55-522f5d53497a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:18:03 crc kubenswrapper[4740]: I0105 14:18:03.941989 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "50749d1a-ca02-4a0c-8f55-522f5d53497a" (UID: "50749d1a-ca02-4a0c-8f55-522f5d53497a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:18:03 crc kubenswrapper[4740]: I0105 14:18:03.968195 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-inventory" (OuterVolumeSpecName: "inventory") pod "50749d1a-ca02-4a0c-8f55-522f5d53497a" (UID: "50749d1a-ca02-4a0c-8f55-522f5d53497a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.002030 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.002285 4740 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.002348 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50749d1a-ca02-4a0c-8f55-522f5d53497a-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.002413 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvvfl\" (UniqueName: \"kubernetes.io/projected/50749d1a-ca02-4a0c-8f55-522f5d53497a-kube-api-access-nvvfl\") on node \"crc\" DevicePath \"\"" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.149034 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" event={"ID":"50749d1a-ca02-4a0c-8f55-522f5d53497a","Type":"ContainerDied","Data":"5c6082a97b9122df4df7c5fead45ecfc361c09d05dd73b0aa3b5a294472fd9c8"} Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.149306 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c6082a97b9122df4df7c5fead45ecfc361c09d05dd73b0aa3b5a294472fd9c8" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.149361 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-997gx" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.196892 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2"] Jan 05 14:18:04 crc kubenswrapper[4740]: E0105 14:18:04.197429 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50749d1a-ca02-4a0c-8f55-522f5d53497a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.197448 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="50749d1a-ca02-4a0c-8f55-522f5d53497a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.197723 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="50749d1a-ca02-4a0c-8f55-522f5d53497a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.198544 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.209240 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2"] Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.235307 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.235406 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.235695 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.236128 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.308786 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/611be36b-14e8-4639-a2a5-f1ed357cfc34-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2\" (UID: \"611be36b-14e8-4639-a2a5-f1ed357cfc34\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.308844 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zmzv\" (UniqueName: \"kubernetes.io/projected/611be36b-14e8-4639-a2a5-f1ed357cfc34-kube-api-access-8zmzv\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2\" (UID: \"611be36b-14e8-4639-a2a5-f1ed357cfc34\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.308998 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/611be36b-14e8-4639-a2a5-f1ed357cfc34-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2\" (UID: \"611be36b-14e8-4639-a2a5-f1ed357cfc34\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.410818 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/611be36b-14e8-4639-a2a5-f1ed357cfc34-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2\" (UID: \"611be36b-14e8-4639-a2a5-f1ed357cfc34\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.410922 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zmzv\" (UniqueName: \"kubernetes.io/projected/611be36b-14e8-4639-a2a5-f1ed357cfc34-kube-api-access-8zmzv\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2\" (UID: \"611be36b-14e8-4639-a2a5-f1ed357cfc34\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.411100 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/611be36b-14e8-4639-a2a5-f1ed357cfc34-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2\" (UID: \"611be36b-14e8-4639-a2a5-f1ed357cfc34\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.415872 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/611be36b-14e8-4639-a2a5-f1ed357cfc34-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2\" (UID: \"611be36b-14e8-4639-a2a5-f1ed357cfc34\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.417887 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/611be36b-14e8-4639-a2a5-f1ed357cfc34-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2\" (UID: \"611be36b-14e8-4639-a2a5-f1ed357cfc34\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.428159 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zmzv\" (UniqueName: \"kubernetes.io/projected/611be36b-14e8-4639-a2a5-f1ed357cfc34-kube-api-access-8zmzv\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2\" (UID: \"611be36b-14e8-4639-a2a5-f1ed357cfc34\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" Jan 05 14:18:04 crc kubenswrapper[4740]: I0105 14:18:04.555763 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" Jan 05 14:18:05 crc kubenswrapper[4740]: I0105 14:18:05.169972 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2"] Jan 05 14:18:05 crc kubenswrapper[4740]: I0105 14:18:05.170971 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 05 14:18:06 crc kubenswrapper[4740]: I0105 14:18:06.179047 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" event={"ID":"611be36b-14e8-4639-a2a5-f1ed357cfc34","Type":"ContainerStarted","Data":"7437aafff5ca2ffcf6dcabdada5dccd14cb18e9df35036c080779716f3ae2a2e"} Jan 05 14:18:06 crc kubenswrapper[4740]: I0105 14:18:06.180229 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" event={"ID":"611be36b-14e8-4639-a2a5-f1ed357cfc34","Type":"ContainerStarted","Data":"91e2171aa5049e03449653ddc3b984f8deb5e39118e9e85a364d1ad56b8d73d1"} Jan 05 14:18:06 crc kubenswrapper[4740]: I0105 14:18:06.219367 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" podStartSLOduration=1.741677189 podStartE2EDuration="2.219341701s" podCreationTimestamp="2026-01-05 14:18:04 +0000 UTC" firstStartedPulling="2026-01-05 14:18:05.1701025 +0000 UTC m=+1734.477011099" lastFinishedPulling="2026-01-05 14:18:05.647766992 +0000 UTC m=+1734.954675611" observedRunningTime="2026-01-05 14:18:06.208426436 +0000 UTC m=+1735.515335025" watchObservedRunningTime="2026-01-05 14:18:06.219341701 +0000 UTC m=+1735.526250290" Jan 05 14:18:11 crc kubenswrapper[4740]: I0105 14:18:11.069543 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-gpc72"] 
Jan 05 14:18:11 crc kubenswrapper[4740]: I0105 14:18:11.084873 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-gpc72"] Jan 05 14:18:11 crc kubenswrapper[4740]: I0105 14:18:11.271616 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-njhh4"] Jan 05 14:18:11 crc kubenswrapper[4740]: I0105 14:18:11.275740 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-njhh4" Jan 05 14:18:11 crc kubenswrapper[4740]: I0105 14:18:11.293740 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-njhh4"] Jan 05 14:18:11 crc kubenswrapper[4740]: I0105 14:18:11.331939 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lht29\" (UniqueName: \"kubernetes.io/projected/4b0d8529-6e20-4227-82c7-f47ef37595ae-kube-api-access-lht29\") pod \"redhat-operators-njhh4\" (UID: \"4b0d8529-6e20-4227-82c7-f47ef37595ae\") " pod="openshift-marketplace/redhat-operators-njhh4" Jan 05 14:18:11 crc kubenswrapper[4740]: I0105 14:18:11.332416 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b0d8529-6e20-4227-82c7-f47ef37595ae-utilities\") pod \"redhat-operators-njhh4\" (UID: \"4b0d8529-6e20-4227-82c7-f47ef37595ae\") " pod="openshift-marketplace/redhat-operators-njhh4" Jan 05 14:18:11 crc kubenswrapper[4740]: I0105 14:18:11.332565 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b0d8529-6e20-4227-82c7-f47ef37595ae-catalog-content\") pod \"redhat-operators-njhh4\" (UID: \"4b0d8529-6e20-4227-82c7-f47ef37595ae\") " pod="openshift-marketplace/redhat-operators-njhh4" Jan 05 14:18:11 crc kubenswrapper[4740]: I0105 14:18:11.433847 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b0d8529-6e20-4227-82c7-f47ef37595ae-utilities\") pod \"redhat-operators-njhh4\" (UID: \"4b0d8529-6e20-4227-82c7-f47ef37595ae\") " pod="openshift-marketplace/redhat-operators-njhh4" Jan 05 14:18:11 crc kubenswrapper[4740]: I0105 14:18:11.433890 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b0d8529-6e20-4227-82c7-f47ef37595ae-catalog-content\") pod \"redhat-operators-njhh4\" (UID: \"4b0d8529-6e20-4227-82c7-f47ef37595ae\") " pod="openshift-marketplace/redhat-operators-njhh4" Jan 05 14:18:11 crc kubenswrapper[4740]: I0105 14:18:11.434006 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lht29\" (UniqueName: \"kubernetes.io/projected/4b0d8529-6e20-4227-82c7-f47ef37595ae-kube-api-access-lht29\") pod \"redhat-operators-njhh4\" (UID: \"4b0d8529-6e20-4227-82c7-f47ef37595ae\") " pod="openshift-marketplace/redhat-operators-njhh4" Jan 05 14:18:11 crc kubenswrapper[4740]: I0105 14:18:11.434452 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b0d8529-6e20-4227-82c7-f47ef37595ae-utilities\") pod \"redhat-operators-njhh4\" (UID: \"4b0d8529-6e20-4227-82c7-f47ef37595ae\") " pod="openshift-marketplace/redhat-operators-njhh4" Jan 05 14:18:11 crc kubenswrapper[4740]: I0105 14:18:11.434494 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b0d8529-6e20-4227-82c7-f47ef37595ae-catalog-content\") pod \"redhat-operators-njhh4\" (UID: \"4b0d8529-6e20-4227-82c7-f47ef37595ae\") " pod="openshift-marketplace/redhat-operators-njhh4" Jan 05 14:18:11 crc kubenswrapper[4740]: I0105 14:18:11.468086 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lht29\" (UniqueName: \"kubernetes.io/projected/4b0d8529-6e20-4227-82c7-f47ef37595ae-kube-api-access-lht29\") pod \"redhat-operators-njhh4\" (UID: \"4b0d8529-6e20-4227-82c7-f47ef37595ae\") " pod="openshift-marketplace/redhat-operators-njhh4" Jan 05 14:18:11 crc kubenswrapper[4740]: I0105 14:18:11.600934 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-njhh4" Jan 05 14:18:11 crc kubenswrapper[4740]: I0105 14:18:11.969601 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:18:11 crc kubenswrapper[4740]: E0105 14:18:11.970269 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:18:12 crc kubenswrapper[4740]: I0105 14:18:12.290957 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-njhh4"] Jan 05 14:18:12 crc kubenswrapper[4740]: I0105 14:18:12.985269 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25" path="/var/lib/kubelet/pods/adbfdbc4-e4a5-46ed-9f3e-ccf21e081e25/volumes" Jan 05 14:18:13 crc kubenswrapper[4740]: I0105 14:18:13.276207 4740 generic.go:334] "Generic (PLEG): container finished" podID="4b0d8529-6e20-4227-82c7-f47ef37595ae" containerID="2f54d0639d1fad82b2bcc1c4b01507f95997a06a41d7d5ecad1bbaff1f652cd0" exitCode=0 Jan 05 14:18:13 crc kubenswrapper[4740]: I0105 14:18:13.276250 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njhh4" event={"ID":"4b0d8529-6e20-4227-82c7-f47ef37595ae","Type":"ContainerDied","Data":"2f54d0639d1fad82b2bcc1c4b01507f95997a06a41d7d5ecad1bbaff1f652cd0"} Jan 05 14:18:13 crc kubenswrapper[4740]: I0105 14:18:13.276276 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njhh4" event={"ID":"4b0d8529-6e20-4227-82c7-f47ef37595ae","Type":"ContainerStarted","Data":"a95a5555296e45fede417f367c92e5db411c22d9c3436c924fde0be96715c297"} Jan 05 14:18:15 crc kubenswrapper[4740]: I0105 14:18:15.307628 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njhh4" event={"ID":"4b0d8529-6e20-4227-82c7-f47ef37595ae","Type":"ContainerStarted","Data":"a68cade31be8d3e081b2e226557b6540754a833ea9f98f2dc7eb2c6d0b87e39f"} Jan 05 14:18:19 crc kubenswrapper[4740]: I0105 14:18:19.037293 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-qmxnw"] Jan 05 14:18:19 crc kubenswrapper[4740]: I0105 14:18:19.048916 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-qmxnw"] Jan 05 14:18:20 crc 
kubenswrapper[4740]: I0105 14:18:20.384415 4740 generic.go:334] "Generic (PLEG): container finished" podID="4b0d8529-6e20-4227-82c7-f47ef37595ae" containerID="a68cade31be8d3e081b2e226557b6540754a833ea9f98f2dc7eb2c6d0b87e39f" exitCode=0 Jan 05 14:18:20 crc kubenswrapper[4740]: I0105 14:18:20.384528 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njhh4" event={"ID":"4b0d8529-6e20-4227-82c7-f47ef37595ae","Type":"ContainerDied","Data":"a68cade31be8d3e081b2e226557b6540754a833ea9f98f2dc7eb2c6d0b87e39f"} Jan 05 14:18:21 crc kubenswrapper[4740]: I0105 14:18:21.011396 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de1ed2b0-7aa0-43ad-a948-07e7b8de711f" path="/var/lib/kubelet/pods/de1ed2b0-7aa0-43ad-a948-07e7b8de711f/volumes" Jan 05 14:18:21 crc kubenswrapper[4740]: I0105 14:18:21.402328 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njhh4" event={"ID":"4b0d8529-6e20-4227-82c7-f47ef37595ae","Type":"ContainerStarted","Data":"44c1edefb219111e92fe82d7539ba2068a29d1c92e23da6cf007ba474240eb06"} Jan 05 14:18:21 crc kubenswrapper[4740]: I0105 14:18:21.438681 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-njhh4" podStartSLOduration=2.734249734 podStartE2EDuration="10.438658295s" podCreationTimestamp="2026-01-05 14:18:11 +0000 UTC" firstStartedPulling="2026-01-05 14:18:13.278869234 +0000 UTC m=+1742.585777813" lastFinishedPulling="2026-01-05 14:18:20.983277795 +0000 UTC m=+1750.290186374" observedRunningTime="2026-01-05 14:18:21.420906135 +0000 UTC m=+1750.727814724" watchObservedRunningTime="2026-01-05 14:18:21.438658295 +0000 UTC m=+1750.745566894" Jan 05 14:18:21 crc kubenswrapper[4740]: I0105 14:18:21.601198 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-njhh4" Jan 05 14:18:21 crc kubenswrapper[4740]: I0105 14:18:21.601382 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-njhh4" Jan 05 14:18:22 crc kubenswrapper[4740]: I0105 14:18:22.659735 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-njhh4" podUID="4b0d8529-6e20-4227-82c7-f47ef37595ae" containerName="registry-server" probeResult="failure" output=< Jan 05 14:18:22 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 14:18:22 crc kubenswrapper[4740]: > Jan 05 14:18:24 crc kubenswrapper[4740]: I0105 14:18:24.041728 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-pn5jk"] Jan 05 14:18:24 crc kubenswrapper[4740]: I0105 14:18:24.057123 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-mx79w"] Jan 05 14:18:24 crc kubenswrapper[4740]: I0105 14:18:24.077819 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-pn5jk"] Jan 05 14:18:24 crc kubenswrapper[4740]: I0105 14:18:24.089447 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-mx79w"] Jan 05 14:18:24 crc kubenswrapper[4740]: I0105 14:18:24.098491 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-0ef3-account-create-update-lqc7s"] Jan 05 14:18:24 crc kubenswrapper[4740]: I0105 14:18:24.107723 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-0ef3-account-create-update-lqc7s"] Jan 05 14:18:24 crc 
kubenswrapper[4740]: I0105 14:18:24.982117 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2def1143-79d4-48ec-bae4-f0b620c8bc73" path="/var/lib/kubelet/pods/2def1143-79d4-48ec-bae4-f0b620c8bc73/volumes" Jan 05 14:18:24 crc kubenswrapper[4740]: I0105 14:18:24.983216 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37129b31-a43d-412f-878e-72729d852124" path="/var/lib/kubelet/pods/37129b31-a43d-412f-878e-72729d852124/volumes" Jan 05 14:18:24 crc kubenswrapper[4740]: I0105 14:18:24.984329 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edb21d30-fa05-490b-b1b1-cb0f73b8e149" path="/var/lib/kubelet/pods/edb21d30-fa05-490b-b1b1-cb0f73b8e149/volumes" Jan 05 14:18:25 crc kubenswrapper[4740]: I0105 14:18:25.968465 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:18:25 crc kubenswrapper[4740]: E0105 14:18:25.969207 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:18:27 crc kubenswrapper[4740]: I0105 14:18:27.048835 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-mljnv"] Jan 05 14:18:27 crc kubenswrapper[4740]: I0105 14:18:27.062936 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d0fc-account-create-update-tc228"] Jan 05 14:18:27 crc kubenswrapper[4740]: I0105 14:18:27.076316 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-e6c0-account-create-update-tr7lk"] Jan 05 14:18:27 crc kubenswrapper[4740]: I0105 14:18:27.090229 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d0fc-account-create-update-tc228"] Jan 05 14:18:27 crc kubenswrapper[4740]: I0105 14:18:27.101573 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-e6c0-account-create-update-tr7lk"] Jan 05 14:18:27 crc kubenswrapper[4740]: I0105 14:18:27.112137 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-mljnv"] Jan 05 14:18:27 crc kubenswrapper[4740]: I0105 14:18:27.121744 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-0cc4-account-create-update-k9mmk"] Jan 05 14:18:27 crc kubenswrapper[4740]: I0105 14:18:27.131323 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-9k7k2"] Jan 05 14:18:27 crc kubenswrapper[4740]: I0105 14:18:27.140890 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-0cc4-account-create-update-k9mmk"] Jan 05 14:18:27 crc kubenswrapper[4740]: I0105 14:18:27.151638 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-9k7k2"] Jan 05 14:18:28 crc kubenswrapper[4740]: I0105 14:18:28.983552 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90dd4a82-d074-47a2-a853-d7376f242c35" path="/var/lib/kubelet/pods/90dd4a82-d074-47a2-a853-d7376f242c35/volumes" Jan 05 14:18:28 crc kubenswrapper[4740]: I0105 14:18:28.984903 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1785e28-d1f5-4985-b080-4496c9b43c6e" 
path="/var/lib/kubelet/pods/a1785e28-d1f5-4985-b080-4496c9b43c6e/volumes" Jan 05 14:18:28 crc kubenswrapper[4740]: I0105 14:18:28.986387 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac07b17f-06ac-42c1-8e1f-49f2101f4774" path="/var/lib/kubelet/pods/ac07b17f-06ac-42c1-8e1f-49f2101f4774/volumes" Jan 05 14:18:28 crc kubenswrapper[4740]: I0105 14:18:28.987463 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e383395f-47a3-4209-9cdb-102a606d9ce2" path="/var/lib/kubelet/pods/e383395f-47a3-4209-9cdb-102a606d9ce2/volumes" Jan 05 14:18:28 crc kubenswrapper[4740]: I0105 14:18:28.989397 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef9f4db3-55ef-4b94-8044-d317f3b1b760" path="/var/lib/kubelet/pods/ef9f4db3-55ef-4b94-8044-d317f3b1b760/volumes" Jan 05 14:18:29 crc kubenswrapper[4740]: I0105 14:18:29.693521 4740 scope.go:117] "RemoveContainer" containerID="681acffd14d38ee97ffaa04619d8ca2e02a1a221f2a07acb2863bbe1285ad428" Jan 05 14:18:29 crc kubenswrapper[4740]: I0105 14:18:29.773979 4740 scope.go:117] "RemoveContainer" containerID="e1e9b1fce0283d30a03b8f07252671b21169cb2d8c83348559c49139fb8c4386" Jan 05 14:18:29 crc kubenswrapper[4740]: I0105 14:18:29.799523 4740 scope.go:117] "RemoveContainer" containerID="81006f9e448378ee3c82b701651a413f1f481b235662dcd15a30d0ef724a6fef" Jan 05 14:18:29 crc kubenswrapper[4740]: I0105 14:18:29.823896 4740 scope.go:117] "RemoveContainer" containerID="174a7349cf3ac32a5b531c896aeb97cc51cf326a1acf7a4c3e76f0f836a180ca" Jan 05 14:18:29 crc kubenswrapper[4740]: I0105 14:18:29.852596 4740 scope.go:117] "RemoveContainer" containerID="8472ad756cad61d21dcee5cb12bc9cc73223e9f11a71ed3979e4ba5b12b239af" Jan 05 14:18:29 crc kubenswrapper[4740]: I0105 14:18:29.919297 4740 scope.go:117] "RemoveContainer" containerID="6782d68fdddbaa23a295d893e21ceacc93684ba92d9b7cb6d02530071bffa21e" Jan 05 14:18:29 crc kubenswrapper[4740]: I0105 14:18:29.998113 4740 scope.go:117] "RemoveContainer" containerID="cab7fe74becde430c00735e1d5b4e7fc02684c5356128c158c54dfc610667e3c" Jan 05 14:18:30 crc kubenswrapper[4740]: I0105 14:18:30.051890 4740 scope.go:117] "RemoveContainer" containerID="9bf43a34cde5a558352caf73d932ff7decda5087803faf196b44cdd9ea50b484" Jan 05 14:18:30 crc kubenswrapper[4740]: I0105 14:18:30.102498 4740 scope.go:117] "RemoveContainer" containerID="e2b4f37276b9fcdb3c16877c9626870f09549b6be3d7cb470634286a077a7336" Jan 05 14:18:30 crc kubenswrapper[4740]: I0105 14:18:30.158315 4740 scope.go:117] "RemoveContainer" containerID="1dfbcabc3dd9c2cf3091de5da1f5cb9da5db8b271d62adc2f1687d43ee91b281" Jan 05 14:18:30 crc kubenswrapper[4740]: I0105 14:18:30.179581 4740 scope.go:117] "RemoveContainer" containerID="81a3ee8545f2033c596c286d3ade79385f08d5f91f7b77d857a469f3eccbec3e" Jan 05 14:18:30 crc kubenswrapper[4740]: I0105 14:18:30.202631 4740 scope.go:117] "RemoveContainer" containerID="5543ab60f4ffc93cdbbd1acf5198b760b3d4c50c32afb6e037d5961782aac2d2" Jan 05 14:18:30 crc kubenswrapper[4740]: I0105 14:18:30.241627 4740 scope.go:117] "RemoveContainer" containerID="dabb4dba51eb13ff6f67a3f332098c2fe64c2d6f6b22d003a9c547e7d9f0f202" Jan 05 14:18:30 crc kubenswrapper[4740]: I0105 14:18:30.275685 4740 scope.go:117] "RemoveContainer" containerID="4807b7354a9fe83590199e8beee3abe136c43f6542e227fc14701977ca03c052" Jan 05 14:18:30 crc kubenswrapper[4740]: I0105 14:18:30.305456 4740 scope.go:117] "RemoveContainer" containerID="cd6b3b4eefadf7d19bd0fa339f92ed4fd32f5292fb84b9e63019f49df6acc14f" Jan 05 14:18:30 crc kubenswrapper[4740]: 
I0105 14:18:30.342544 4740 scope.go:117] "RemoveContainer" containerID="fb72cdd97defef4e104c59d230ecd53fcc97cff5342fd67e0da5c4fa2b298dec" Jan 05 14:18:30 crc kubenswrapper[4740]: I0105 14:18:30.393736 4740 scope.go:117] "RemoveContainer" containerID="2e79f741f4173c037d418f97c730da16b0bba6d15f5f73c4a6341b4a76f3ad1f" Jan 05 14:18:30 crc kubenswrapper[4740]: I0105 14:18:30.438176 4740 scope.go:117] "RemoveContainer" containerID="dc8565b5a104352f6fa87bc382b60ad7d5eced67a01b6fd19123cde70af1f693" Jan 05 14:18:30 crc kubenswrapper[4740]: I0105 14:18:30.477559 4740 scope.go:117] "RemoveContainer" containerID="4d165707cc32025d51288c8e4e521265721edec5401826803867494484f1c03d" Jan 05 14:18:30 crc kubenswrapper[4740]: I0105 14:18:30.508889 4740 scope.go:117] "RemoveContainer" containerID="cb16ffa83ce73d0993c3934e6a9a4e8966c568106bbcc5373f6eb798a9d4d788" Jan 05 14:18:30 crc kubenswrapper[4740]: I0105 14:18:30.534850 4740 scope.go:117] "RemoveContainer" containerID="1da9c538825a5a7615d10f0d492f1312807aac4394eabab3b468d19fc7303f25" Jan 05 14:18:30 crc kubenswrapper[4740]: I0105 14:18:30.559726 4740 scope.go:117] "RemoveContainer" containerID="9fd2217025cd011d3f47e68738af8a02300b10d3ffae26526c476014fb590ce2" Jan 05 14:18:30 crc kubenswrapper[4740]: I0105 14:18:30.585631 4740 scope.go:117] "RemoveContainer" containerID="982f84fd58e7b388c60b6cdcac99b85ba29bd739e28edaf988a6e07ba0d0b54c" Jan 05 14:18:30 crc kubenswrapper[4740]: I0105 14:18:30.614351 4740 scope.go:117] "RemoveContainer" containerID="1a3aaa76db1c68f48c71ebab38598b5d130c3cfd6ea367b37e2a17108f3d4720" Jan 05 14:18:30 crc kubenswrapper[4740]: I0105 14:18:30.636086 4740 scope.go:117] "RemoveContainer" containerID="344cc711f70384fa901eaf2e97f65aef64f354583b12adf8a71530c7ac23aee1" Jan 05 14:18:32 crc kubenswrapper[4740]: I0105 14:18:32.035806 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-kzngh"] Jan 05 14:18:32 crc kubenswrapper[4740]: I0105 14:18:32.046916 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-kzngh"] Jan 05 14:18:32 crc kubenswrapper[4740]: I0105 14:18:32.647237 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-njhh4" podUID="4b0d8529-6e20-4227-82c7-f47ef37595ae" containerName="registry-server" probeResult="failure" output=< Jan 05 14:18:32 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 14:18:32 crc kubenswrapper[4740]: > Jan 05 14:18:32 crc kubenswrapper[4740]: I0105 14:18:32.983410 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aa5ad76-5e26-4c24-8b3a-50ff2a182523" path="/var/lib/kubelet/pods/6aa5ad76-5e26-4c24-8b3a-50ff2a182523/volumes" Jan 05 14:18:40 crc kubenswrapper[4740]: I0105 14:18:40.976932 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:18:40 crc kubenswrapper[4740]: E0105 14:18:40.978162 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:18:41 crc kubenswrapper[4740]: I0105 14:18:41.689858 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-njhh4" Jan 05 14:18:41 crc kubenswrapper[4740]: I0105 14:18:41.759654 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-njhh4" Jan 05 14:18:42 crc kubenswrapper[4740]: I0105 14:18:42.489429 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-njhh4"] Jan 05 14:18:43 crc kubenswrapper[4740]: I0105 14:18:43.716485 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-njhh4" podUID="4b0d8529-6e20-4227-82c7-f47ef37595ae" containerName="registry-server" containerID="cri-o://44c1edefb219111e92fe82d7539ba2068a29d1c92e23da6cf007ba474240eb06" gracePeriod=2 Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.247961 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-njhh4" Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.424906 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b0d8529-6e20-4227-82c7-f47ef37595ae-utilities\") pod \"4b0d8529-6e20-4227-82c7-f47ef37595ae\" (UID: \"4b0d8529-6e20-4227-82c7-f47ef37595ae\") " Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.425296 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lht29\" (UniqueName: \"kubernetes.io/projected/4b0d8529-6e20-4227-82c7-f47ef37595ae-kube-api-access-lht29\") pod \"4b0d8529-6e20-4227-82c7-f47ef37595ae\" (UID: \"4b0d8529-6e20-4227-82c7-f47ef37595ae\") " Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.425382 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b0d8529-6e20-4227-82c7-f47ef37595ae-catalog-content\") pod \"4b0d8529-6e20-4227-82c7-f47ef37595ae\" (UID: \"4b0d8529-6e20-4227-82c7-f47ef37595ae\") " Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.425575 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b0d8529-6e20-4227-82c7-f47ef37595ae-utilities" (OuterVolumeSpecName: "utilities") pod "4b0d8529-6e20-4227-82c7-f47ef37595ae" (UID: "4b0d8529-6e20-4227-82c7-f47ef37595ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.426287 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b0d8529-6e20-4227-82c7-f47ef37595ae-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.431604 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b0d8529-6e20-4227-82c7-f47ef37595ae-kube-api-access-lht29" (OuterVolumeSpecName: "kube-api-access-lht29") pod "4b0d8529-6e20-4227-82c7-f47ef37595ae" (UID: "4b0d8529-6e20-4227-82c7-f47ef37595ae"). InnerVolumeSpecName "kube-api-access-lht29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.528491 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lht29\" (UniqueName: \"kubernetes.io/projected/4b0d8529-6e20-4227-82c7-f47ef37595ae-kube-api-access-lht29\") on node \"crc\" DevicePath \"\"" Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.544647 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b0d8529-6e20-4227-82c7-f47ef37595ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b0d8529-6e20-4227-82c7-f47ef37595ae" (UID: "4b0d8529-6e20-4227-82c7-f47ef37595ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.632271 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b0d8529-6e20-4227-82c7-f47ef37595ae-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.733325 4740 generic.go:334] "Generic (PLEG): container finished" podID="4b0d8529-6e20-4227-82c7-f47ef37595ae" containerID="44c1edefb219111e92fe82d7539ba2068a29d1c92e23da6cf007ba474240eb06" exitCode=0 Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.733372 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njhh4" event={"ID":"4b0d8529-6e20-4227-82c7-f47ef37595ae","Type":"ContainerDied","Data":"44c1edefb219111e92fe82d7539ba2068a29d1c92e23da6cf007ba474240eb06"} Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.733401 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njhh4" event={"ID":"4b0d8529-6e20-4227-82c7-f47ef37595ae","Type":"ContainerDied","Data":"a95a5555296e45fede417f367c92e5db411c22d9c3436c924fde0be96715c297"} Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.733418 4740 scope.go:117] "RemoveContainer" containerID="44c1edefb219111e92fe82d7539ba2068a29d1c92e23da6cf007ba474240eb06" Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.733576 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-njhh4" Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.775184 4740 scope.go:117] "RemoveContainer" containerID="a68cade31be8d3e081b2e226557b6540754a833ea9f98f2dc7eb2c6d0b87e39f" Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.775752 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-njhh4"] Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.788940 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-njhh4"] Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.813225 4740 scope.go:117] "RemoveContainer" containerID="2f54d0639d1fad82b2bcc1c4b01507f95997a06a41d7d5ecad1bbaff1f652cd0" Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.856702 4740 scope.go:117] "RemoveContainer" containerID="44c1edefb219111e92fe82d7539ba2068a29d1c92e23da6cf007ba474240eb06" Jan 05 14:18:44 crc kubenswrapper[4740]: E0105 14:18:44.857321 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44c1edefb219111e92fe82d7539ba2068a29d1c92e23da6cf007ba474240eb06\": container with ID starting with 44c1edefb219111e92fe82d7539ba2068a29d1c92e23da6cf007ba474240eb06 not found: ID does not exist" containerID="44c1edefb219111e92fe82d7539ba2068a29d1c92e23da6cf007ba474240eb06" Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.857375 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44c1edefb219111e92fe82d7539ba2068a29d1c92e23da6cf007ba474240eb06"} err="failed to get container status \"44c1edefb219111e92fe82d7539ba2068a29d1c92e23da6cf007ba474240eb06\": rpc error: code = NotFound desc = could not find container \"44c1edefb219111e92fe82d7539ba2068a29d1c92e23da6cf007ba474240eb06\": container with ID starting with 44c1edefb219111e92fe82d7539ba2068a29d1c92e23da6cf007ba474240eb06 not found: ID does not exist" Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.857414 4740 scope.go:117] "RemoveContainer" containerID="a68cade31be8d3e081b2e226557b6540754a833ea9f98f2dc7eb2c6d0b87e39f" Jan 05 14:18:44 crc kubenswrapper[4740]: E0105 14:18:44.857785 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a68cade31be8d3e081b2e226557b6540754a833ea9f98f2dc7eb2c6d0b87e39f\": container with ID starting with a68cade31be8d3e081b2e226557b6540754a833ea9f98f2dc7eb2c6d0b87e39f not found: ID does not exist" containerID="a68cade31be8d3e081b2e226557b6540754a833ea9f98f2dc7eb2c6d0b87e39f" Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.857832 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a68cade31be8d3e081b2e226557b6540754a833ea9f98f2dc7eb2c6d0b87e39f"} err="failed to get container status \"a68cade31be8d3e081b2e226557b6540754a833ea9f98f2dc7eb2c6d0b87e39f\": rpc error: code = NotFound desc = could not find container \"a68cade31be8d3e081b2e226557b6540754a833ea9f98f2dc7eb2c6d0b87e39f\": container with ID starting with a68cade31be8d3e081b2e226557b6540754a833ea9f98f2dc7eb2c6d0b87e39f not found: ID does not exist" Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.857860 4740 scope.go:117] "RemoveContainer" containerID="2f54d0639d1fad82b2bcc1c4b01507f95997a06a41d7d5ecad1bbaff1f652cd0" Jan 05 14:18:44 crc kubenswrapper[4740]: E0105 14:18:44.858237 4740 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"2f54d0639d1fad82b2bcc1c4b01507f95997a06a41d7d5ecad1bbaff1f652cd0\": container with ID starting with 2f54d0639d1fad82b2bcc1c4b01507f95997a06a41d7d5ecad1bbaff1f652cd0 not found: ID does not exist" containerID="2f54d0639d1fad82b2bcc1c4b01507f95997a06a41d7d5ecad1bbaff1f652cd0" Jan 05 14:18:44 crc kubenswrapper[4740]: I0105 14:18:44.858280 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f54d0639d1fad82b2bcc1c4b01507f95997a06a41d7d5ecad1bbaff1f652cd0"} err="failed to get container status \"2f54d0639d1fad82b2bcc1c4b01507f95997a06a41d7d5ecad1bbaff1f652cd0\": rpc error: code = NotFound desc = could not find container \"2f54d0639d1fad82b2bcc1c4b01507f95997a06a41d7d5ecad1bbaff1f652cd0\": container with ID starting with 2f54d0639d1fad82b2bcc1c4b01507f95997a06a41d7d5ecad1bbaff1f652cd0 not found: ID does not exist" Jan 05 14:18:45 crc kubenswrapper[4740]: I0105 14:18:45.012979 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b0d8529-6e20-4227-82c7-f47ef37595ae" path="/var/lib/kubelet/pods/4b0d8529-6e20-4227-82c7-f47ef37595ae/volumes" Jan 05 14:18:52 crc kubenswrapper[4740]: I0105 14:18:52.970231 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:18:52 crc kubenswrapper[4740]: E0105 14:18:52.971299 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:19:03 crc kubenswrapper[4740]: I0105 14:19:03.056537 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-rkvws"] Jan 05 14:19:03 crc kubenswrapper[4740]: I0105 14:19:03.067529 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-rkvws"] Jan 05 14:19:03 crc kubenswrapper[4740]: I0105 14:19:03.970026 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:19:03 crc kubenswrapper[4740]: E0105 14:19:03.970646 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:19:04 crc kubenswrapper[4740]: I0105 14:19:04.980921 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60a4f2a8-8fa7-44b6-a6bc-15531c720f24" path="/var/lib/kubelet/pods/60a4f2a8-8fa7-44b6-a6bc-15531c720f24/volumes" Jan 05 14:19:16 crc kubenswrapper[4740]: I0105 14:19:16.969674 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:19:16 crc kubenswrapper[4740]: E0105 14:19:16.972696 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:19:21 crc kubenswrapper[4740]: I0105 14:19:21.048163 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-d6mcw"] Jan 05 14:19:21 crc kubenswrapper[4740]: I0105 14:19:21.064577 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-kghbd"] Jan 05 14:19:21 crc kubenswrapper[4740]: I0105 14:19:21.079834 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-kghbd"] Jan 05 14:19:21 crc kubenswrapper[4740]: I0105 14:19:21.088187 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-d6mcw"] Jan 05 14:19:22 crc kubenswrapper[4740]: I0105 14:19:22.979734 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62f3ba3c-1c87-449c-97e5-bfe06a37ef15" path="/var/lib/kubelet/pods/62f3ba3c-1c87-449c-97e5-bfe06a37ef15/volumes" Jan 05 14:19:22 crc kubenswrapper[4740]: I0105 14:19:22.980810 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b7a41a0-75e3-4638-9856-90adffd28751" path="/var/lib/kubelet/pods/9b7a41a0-75e3-4638-9856-90adffd28751/volumes" Jan 05 14:19:27 crc kubenswrapper[4740]: I0105 14:19:27.054507 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-b2trl"] Jan 05 14:19:27 crc kubenswrapper[4740]: I0105 14:19:27.068057 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-b2trl"] Jan 05 14:19:28 crc kubenswrapper[4740]: I0105 14:19:28.987553 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="401ab705-cb09-4760-840d-2b99b6c9148c" path="/var/lib/kubelet/pods/401ab705-cb09-4760-840d-2b99b6c9148c/volumes" Jan 05 14:19:30 crc kubenswrapper[4740]: I0105 14:19:30.990629 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:19:30 crc kubenswrapper[4740]: E0105 14:19:30.991821 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:19:31 crc kubenswrapper[4740]: I0105 14:19:31.177440 4740 scope.go:117] "RemoveContainer" containerID="eead28032c33e30b0db0e6f9a76d49d68f79d008b0f8d2505a89b617523bf27d" Jan 05 14:19:31 crc kubenswrapper[4740]: I0105 14:19:31.222973 4740 scope.go:117] "RemoveContainer" containerID="677464dd50895f81b81ed97f2d783b8a0c61200f9d99cb6d1b1d1bceafebb22f" Jan 05 14:19:31 crc kubenswrapper[4740]: I0105 14:19:31.283839 4740 scope.go:117] "RemoveContainer" containerID="acbed7293a2f8b258573fd52eb80726a34d8891f6ade563484839fa998b70910" Jan 05 14:19:31 crc kubenswrapper[4740]: I0105 14:19:31.342785 4740 scope.go:117] "RemoveContainer" containerID="f70e30aea270148875810e4804dc878ee6f263cb4476436356e31e95a3d7a23e" Jan 05 14:19:31 crc kubenswrapper[4740]: I0105 14:19:31.410716 4740 scope.go:117] "RemoveContainer" containerID="d4b65a82e39cd7c53f38459a66e71ae3e75f1a310110963ebacd44e27b5fd2a9" Jan 05 14:19:33 crc kubenswrapper[4740]: I0105 14:19:33.048678 
4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-lzmfj"] Jan 05 14:19:33 crc kubenswrapper[4740]: I0105 14:19:33.064494 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-lzmfj"] Jan 05 14:19:34 crc kubenswrapper[4740]: I0105 14:19:34.989016 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="333dfe82-8fdd-400e-8b8c-89906d81e778" path="/var/lib/kubelet/pods/333dfe82-8fdd-400e-8b8c-89906d81e778/volumes" Jan 05 14:19:45 crc kubenswrapper[4740]: I0105 14:19:45.968918 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:19:45 crc kubenswrapper[4740]: E0105 14:19:45.969586 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:19:59 crc kubenswrapper[4740]: I0105 14:19:59.969316 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:19:59 crc kubenswrapper[4740]: E0105 14:19:59.970164 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:20:14 crc kubenswrapper[4740]: I0105 14:20:14.968910 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:20:14 crc kubenswrapper[4740]: E0105 14:20:14.969880 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:20:18 crc kubenswrapper[4740]: I0105 14:20:18.911436 4740 generic.go:334] "Generic (PLEG): container finished" podID="611be36b-14e8-4639-a2a5-f1ed357cfc34" containerID="7437aafff5ca2ffcf6dcabdada5dccd14cb18e9df35036c080779716f3ae2a2e" exitCode=0 Jan 05 14:20:18 crc kubenswrapper[4740]: I0105 14:20:18.911950 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" event={"ID":"611be36b-14e8-4639-a2a5-f1ed357cfc34","Type":"ContainerDied","Data":"7437aafff5ca2ffcf6dcabdada5dccd14cb18e9df35036c080779716f3ae2a2e"} Jan 05 14:20:20 crc kubenswrapper[4740]: I0105 14:20:20.510457 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" Jan 05 14:20:20 crc kubenswrapper[4740]: I0105 14:20:20.662211 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/611be36b-14e8-4639-a2a5-f1ed357cfc34-inventory\") pod \"611be36b-14e8-4639-a2a5-f1ed357cfc34\" (UID: \"611be36b-14e8-4639-a2a5-f1ed357cfc34\") " Jan 05 14:20:20 crc kubenswrapper[4740]: I0105 14:20:20.662279 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/611be36b-14e8-4639-a2a5-f1ed357cfc34-ssh-key\") pod \"611be36b-14e8-4639-a2a5-f1ed357cfc34\" (UID: \"611be36b-14e8-4639-a2a5-f1ed357cfc34\") " Jan 05 14:20:20 crc kubenswrapper[4740]: I0105 14:20:20.662397 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zmzv\" (UniqueName: \"kubernetes.io/projected/611be36b-14e8-4639-a2a5-f1ed357cfc34-kube-api-access-8zmzv\") pod \"611be36b-14e8-4639-a2a5-f1ed357cfc34\" (UID: \"611be36b-14e8-4639-a2a5-f1ed357cfc34\") " Jan 05 14:20:20 crc kubenswrapper[4740]: I0105 14:20:20.676520 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/611be36b-14e8-4639-a2a5-f1ed357cfc34-kube-api-access-8zmzv" (OuterVolumeSpecName: "kube-api-access-8zmzv") pod "611be36b-14e8-4639-a2a5-f1ed357cfc34" (UID: "611be36b-14e8-4639-a2a5-f1ed357cfc34"). InnerVolumeSpecName "kube-api-access-8zmzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:20:20 crc kubenswrapper[4740]: I0105 14:20:20.718049 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611be36b-14e8-4639-a2a5-f1ed357cfc34-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "611be36b-14e8-4639-a2a5-f1ed357cfc34" (UID: "611be36b-14e8-4639-a2a5-f1ed357cfc34"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:20:20 crc kubenswrapper[4740]: I0105 14:20:20.719522 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611be36b-14e8-4639-a2a5-f1ed357cfc34-inventory" (OuterVolumeSpecName: "inventory") pod "611be36b-14e8-4639-a2a5-f1ed357cfc34" (UID: "611be36b-14e8-4639-a2a5-f1ed357cfc34"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:20:20 crc kubenswrapper[4740]: I0105 14:20:20.766364 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/611be36b-14e8-4639-a2a5-f1ed357cfc34-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:20:20 crc kubenswrapper[4740]: I0105 14:20:20.766422 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/611be36b-14e8-4639-a2a5-f1ed357cfc34-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:20:20 crc kubenswrapper[4740]: I0105 14:20:20.766443 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zmzv\" (UniqueName: \"kubernetes.io/projected/611be36b-14e8-4639-a2a5-f1ed357cfc34-kube-api-access-8zmzv\") on node \"crc\" DevicePath \"\"" Jan 05 14:20:20 crc kubenswrapper[4740]: I0105 14:20:20.941105 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" event={"ID":"611be36b-14e8-4639-a2a5-f1ed357cfc34","Type":"ContainerDied","Data":"91e2171aa5049e03449653ddc3b984f8deb5e39118e9e85a364d1ad56b8d73d1"} Jan 05 14:20:20 crc kubenswrapper[4740]: I0105 14:20:20.941147 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91e2171aa5049e03449653ddc3b984f8deb5e39118e9e85a364d1ad56b8d73d1" Jan 05 14:20:20 crc kubenswrapper[4740]: I0105 14:20:20.941207 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.041652 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk"] Jan 05 14:20:21 crc kubenswrapper[4740]: E0105 14:20:21.042229 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b0d8529-6e20-4227-82c7-f47ef37595ae" containerName="registry-server" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.042247 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b0d8529-6e20-4227-82c7-f47ef37595ae" containerName="registry-server" Jan 05 14:20:21 crc kubenswrapper[4740]: E0105 14:20:21.042257 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b0d8529-6e20-4227-82c7-f47ef37595ae" containerName="extract-content" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.042263 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b0d8529-6e20-4227-82c7-f47ef37595ae" containerName="extract-content" Jan 05 14:20:21 crc kubenswrapper[4740]: E0105 14:20:21.042271 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b0d8529-6e20-4227-82c7-f47ef37595ae" containerName="extract-utilities" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.042276 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b0d8529-6e20-4227-82c7-f47ef37595ae" containerName="extract-utilities" Jan 05 14:20:21 crc kubenswrapper[4740]: E0105 14:20:21.042305 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611be36b-14e8-4639-a2a5-f1ed357cfc34" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.042313 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="611be36b-14e8-4639-a2a5-f1ed357cfc34" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.042544 4740 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4b0d8529-6e20-4227-82c7-f47ef37595ae" containerName="registry-server" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.042580 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="611be36b-14e8-4639-a2a5-f1ed357cfc34" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.043379 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.045963 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.046129 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.046322 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.046443 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.053646 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk"] Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.176269 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp2gf\" (UniqueName: \"kubernetes.io/projected/dc645406-ab7f-4676-a442-373f2251f8d1-kube-api-access-bp2gf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk\" (UID: \"dc645406-ab7f-4676-a442-373f2251f8d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.176965 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc645406-ab7f-4676-a442-373f2251f8d1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk\" (UID: \"dc645406-ab7f-4676-a442-373f2251f8d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.177084 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc645406-ab7f-4676-a442-373f2251f8d1-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk\" (UID: \"dc645406-ab7f-4676-a442-373f2251f8d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.279994 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp2gf\" (UniqueName: \"kubernetes.io/projected/dc645406-ab7f-4676-a442-373f2251f8d1-kube-api-access-bp2gf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk\" (UID: \"dc645406-ab7f-4676-a442-373f2251f8d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.280116 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/dc645406-ab7f-4676-a442-373f2251f8d1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk\" (UID: \"dc645406-ab7f-4676-a442-373f2251f8d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.280144 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc645406-ab7f-4676-a442-373f2251f8d1-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk\" (UID: \"dc645406-ab7f-4676-a442-373f2251f8d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.284426 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc645406-ab7f-4676-a442-373f2251f8d1-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk\" (UID: \"dc645406-ab7f-4676-a442-373f2251f8d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.286460 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc645406-ab7f-4676-a442-373f2251f8d1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk\" (UID: \"dc645406-ab7f-4676-a442-373f2251f8d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.298042 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp2gf\" (UniqueName: \"kubernetes.io/projected/dc645406-ab7f-4676-a442-373f2251f8d1-kube-api-access-bp2gf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk\" (UID: \"dc645406-ab7f-4676-a442-373f2251f8d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.410640 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" Jan 05 14:20:21 crc kubenswrapper[4740]: I0105 14:20:21.967071 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk"] Jan 05 14:20:21 crc kubenswrapper[4740]: W0105 14:20:21.967522 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc645406_ab7f_4676_a442_373f2251f8d1.slice/crio-8a279871e06b07684cc4200e8cd7865b898c696ab85fe08e5efaf351c343085b WatchSource:0}: Error finding container 8a279871e06b07684cc4200e8cd7865b898c696ab85fe08e5efaf351c343085b: Status 404 returned error can't find the container with id 8a279871e06b07684cc4200e8cd7865b898c696ab85fe08e5efaf351c343085b Jan 05 14:20:22 crc kubenswrapper[4740]: I0105 14:20:22.043809 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4928-account-create-update-h89c7"] Jan 05 14:20:22 crc kubenswrapper[4740]: I0105 14:20:22.055355 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4928-account-create-update-h89c7"] Jan 05 14:20:22 crc kubenswrapper[4740]: I0105 14:20:22.993872 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6785a14-5ddd-471f-ac7d-eed48bfacd44" path="/var/lib/kubelet/pods/a6785a14-5ddd-471f-ac7d-eed48bfacd44/volumes" Jan 05 14:20:22 crc kubenswrapper[4740]: I0105 14:20:22.994901 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" event={"ID":"dc645406-ab7f-4676-a442-373f2251f8d1","Type":"ContainerStarted","Data":"8a279871e06b07684cc4200e8cd7865b898c696ab85fe08e5efaf351c343085b"} Jan 05 14:20:25 crc kubenswrapper[4740]: I0105 14:20:25.018911 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" event={"ID":"dc645406-ab7f-4676-a442-373f2251f8d1","Type":"ContainerStarted","Data":"31798230d7642ce34a3036b82b522095d735f205b6518055d6607e9f623fdb9a"} Jan 05 14:20:25 crc kubenswrapper[4740]: I0105 14:20:25.054429 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-9hbx9"] Jan 05 14:20:25 crc kubenswrapper[4740]: I0105 14:20:25.072777 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5490-account-create-update-mq5tj"] Jan 05 14:20:25 crc kubenswrapper[4740]: I0105 14:20:25.083196 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-6419-account-create-update-fgr2l"] Jan 05 14:20:25 crc kubenswrapper[4740]: I0105 14:20:25.094737 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-pgxvc"] Jan 05 14:20:25 crc kubenswrapper[4740]: I0105 14:20:25.111580 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-9hbx9"] Jan 05 14:20:25 crc kubenswrapper[4740]: I0105 14:20:25.125187 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-5490-account-create-update-mq5tj"] Jan 05 14:20:25 crc kubenswrapper[4740]: I0105 14:20:25.135854 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-6419-account-create-update-fgr2l"] Jan 05 14:20:25 crc kubenswrapper[4740]: I0105 14:20:25.144251 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-z6tgb"] Jan 05 14:20:25 crc kubenswrapper[4740]: I0105 
14:20:25.158780 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-z6tgb"] Jan 05 14:20:25 crc kubenswrapper[4740]: I0105 14:20:25.168663 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-pgxvc"] Jan 05 14:20:25 crc kubenswrapper[4740]: I0105 14:20:25.175197 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" podStartSLOduration=3.514428277 podStartE2EDuration="4.175171984s" podCreationTimestamp="2026-01-05 14:20:21 +0000 UTC" firstStartedPulling="2026-01-05 14:20:21.970312548 +0000 UTC m=+1871.277221137" lastFinishedPulling="2026-01-05 14:20:22.631056255 +0000 UTC m=+1871.937964844" observedRunningTime="2026-01-05 14:20:25.049365236 +0000 UTC m=+1874.356273815" watchObservedRunningTime="2026-01-05 14:20:25.175171984 +0000 UTC m=+1874.482080583" Jan 05 14:20:26 crc kubenswrapper[4740]: I0105 14:20:26.968982 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:20:26 crc kubenswrapper[4740]: E0105 14:20:26.969864 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:20:26 crc kubenswrapper[4740]: I0105 14:20:26.985693 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b8e97e5-739e-4c24-9768-e6a1fb56bace" path="/var/lib/kubelet/pods/0b8e97e5-739e-4c24-9768-e6a1fb56bace/volumes" Jan 05 14:20:26 crc kubenswrapper[4740]: I0105 14:20:26.986729 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ef0a3cf-3805-44a3-9c13-601726874901" path="/var/lib/kubelet/pods/2ef0a3cf-3805-44a3-9c13-601726874901/volumes" Jan 05 14:20:26 crc kubenswrapper[4740]: I0105 14:20:26.987812 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="386101d7-0cc9-471d-8fe8-95af3e74a121" path="/var/lib/kubelet/pods/386101d7-0cc9-471d-8fe8-95af3e74a121/volumes" Jan 05 14:20:26 crc kubenswrapper[4740]: I0105 14:20:26.988728 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cadffc7b-e58f-4f7e-b312-d477c2ad9429" path="/var/lib/kubelet/pods/cadffc7b-e58f-4f7e-b312-d477c2ad9429/volumes" Jan 05 14:20:26 crc kubenswrapper[4740]: I0105 14:20:26.990400 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e21239f6-f726-4394-b6fe-f7f7f438d7b5" path="/var/lib/kubelet/pods/e21239f6-f726-4394-b6fe-f7f7f438d7b5/volumes" Jan 05 14:20:31 crc kubenswrapper[4740]: I0105 14:20:31.617780 4740 scope.go:117] "RemoveContainer" containerID="a0299e56f0e20b1923f748e3cc6051eddfbc969c7a41b8c3b26e0b30dce6a0b1" Jan 05 14:20:31 crc kubenswrapper[4740]: I0105 14:20:31.650472 4740 scope.go:117] "RemoveContainer" containerID="0a28a0e3d6c7e4ed890c9f0ebacacae40f2aea3bd5e997a9c096805e6bd1464a" Jan 05 14:20:31 crc kubenswrapper[4740]: I0105 14:20:31.709663 4740 scope.go:117] "RemoveContainer" containerID="c0afb2f6f3a91e38cbf0e574ec430518cd4fb41ef4c26143e727965aa21a7f8e" Jan 05 14:20:31 crc kubenswrapper[4740]: I0105 14:20:31.764295 4740 scope.go:117] "RemoveContainer" 
containerID="76bdb405b0441aa360ffd3974db41ffab710eb9b38e2e849c1d97b36aea7789a" Jan 05 14:20:31 crc kubenswrapper[4740]: I0105 14:20:31.841554 4740 scope.go:117] "RemoveContainer" containerID="ac86b8bb7b7f5ab0da899d8fc14c0d261246b98bf618c7a204bf60d3aebd5dd5" Jan 05 14:20:31 crc kubenswrapper[4740]: I0105 14:20:31.895624 4740 scope.go:117] "RemoveContainer" containerID="c83059239fd95d166b17faaba84439fae07290c974d4b57ab0b59c49c0438980" Jan 05 14:20:31 crc kubenswrapper[4740]: I0105 14:20:31.939865 4740 scope.go:117] "RemoveContainer" containerID="5086cd772cc74834eb4e8205b880c36e422e72de4ffb653d932b87b19b168239" Jan 05 14:20:38 crc kubenswrapper[4740]: I0105 14:20:38.969698 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:20:38 crc kubenswrapper[4740]: E0105 14:20:38.970993 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:20:51 crc kubenswrapper[4740]: I0105 14:20:51.970764 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:20:51 crc kubenswrapper[4740]: E0105 14:20:51.972265 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:20:59 crc kubenswrapper[4740]: I0105 14:20:59.043292 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lsnkl"] Jan 05 14:20:59 crc kubenswrapper[4740]: I0105 14:20:59.054619 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lsnkl"] Jan 05 14:21:00 crc kubenswrapper[4740]: I0105 14:21:00.987011 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="231fcf86-a13a-4dce-b991-609c503de1ec" path="/var/lib/kubelet/pods/231fcf86-a13a-4dce-b991-609c503de1ec/volumes" Jan 05 14:21:03 crc kubenswrapper[4740]: I0105 14:21:03.968406 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:21:03 crc kubenswrapper[4740]: E0105 14:21:03.969408 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:21:18 crc kubenswrapper[4740]: I0105 14:21:18.030321 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-g9xk2"] Jan 05 14:21:18 crc kubenswrapper[4740]: I0105 14:21:18.042236 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-g9xk2"] 
Jan 05 14:21:18 crc kubenswrapper[4740]: I0105 14:21:18.972044 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:21:18 crc kubenswrapper[4740]: E0105 14:21:18.972429 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:21:18 crc kubenswrapper[4740]: I0105 14:21:18.992814 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb" path="/var/lib/kubelet/pods/2e5db2b6-d22f-43eb-9a2a-7fa00d9b9cbb/volumes" Jan 05 14:21:22 crc kubenswrapper[4740]: I0105 14:21:22.048618 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-25blk"] Jan 05 14:21:22 crc kubenswrapper[4740]: I0105 14:21:22.059428 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-25blk"] Jan 05 14:21:22 crc kubenswrapper[4740]: I0105 14:21:22.981383 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eb2618b-5bb1-4caf-a6f9-708cddcbf390" path="/var/lib/kubelet/pods/6eb2618b-5bb1-4caf-a6f9-708cddcbf390/volumes" Jan 05 14:21:26 crc kubenswrapper[4740]: I0105 14:21:26.034460 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-b00e-account-create-update-l22g4"] Jan 05 14:21:26 crc kubenswrapper[4740]: I0105 14:21:26.054198 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-b00e-account-create-update-l22g4"] Jan 05 14:21:26 crc kubenswrapper[4740]: I0105 14:21:26.068283 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-c5pxn"] Jan 05 14:21:26 crc kubenswrapper[4740]: I0105 14:21:26.078727 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-c5pxn"] Jan 05 14:21:26 crc kubenswrapper[4740]: I0105 14:21:26.983501 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac66a050-f6b1-49d7-8043-6bcd01940889" path="/var/lib/kubelet/pods/ac66a050-f6b1-49d7-8043-6bcd01940889/volumes" Jan 05 14:21:26 crc kubenswrapper[4740]: I0105 14:21:26.984606 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d54ed0de-5ddc-460f-a94e-f8223e8d36e9" path="/var/lib/kubelet/pods/d54ed0de-5ddc-460f-a94e-f8223e8d36e9/volumes" Jan 05 14:21:31 crc kubenswrapper[4740]: I0105 14:21:31.968194 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:21:31 crc kubenswrapper[4740]: E0105 14:21:31.969063 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:21:32 crc kubenswrapper[4740]: I0105 14:21:32.133602 4740 scope.go:117] "RemoveContainer" containerID="55e7651216719090d790e99574bd370d4cc43c269d7cae4fb327bf843c608baf" Jan 05 14:21:32 crc 
kubenswrapper[4740]: I0105 14:21:32.181782 4740 scope.go:117] "RemoveContainer" containerID="5972087ebfa046e8b0d8dbf9630de93aebaadc4ede2f0ea6a5d4e8ee0499d9d7" Jan 05 14:21:32 crc kubenswrapper[4740]: I0105 14:21:32.265184 4740 scope.go:117] "RemoveContainer" containerID="83a35e5626bd0af0822fc7d563c2a07eae40ef5243d0cecbe9a448d003703d5a" Jan 05 14:21:32 crc kubenswrapper[4740]: I0105 14:21:32.317967 4740 scope.go:117] "RemoveContainer" containerID="23d7a406a4470fe12f4b1940ed029315efec014f896a337fbfbcd31ed88a4b2f" Jan 05 14:21:32 crc kubenswrapper[4740]: I0105 14:21:32.373160 4740 scope.go:117] "RemoveContainer" containerID="a422b28e96fb5cde02201a4137a02f228c697a541da763888ce490fd33b1b8fb" Jan 05 14:21:41 crc kubenswrapper[4740]: I0105 14:21:41.040678 4740 generic.go:334] "Generic (PLEG): container finished" podID="dc645406-ab7f-4676-a442-373f2251f8d1" containerID="31798230d7642ce34a3036b82b522095d735f205b6518055d6607e9f623fdb9a" exitCode=0 Jan 05 14:21:41 crc kubenswrapper[4740]: I0105 14:21:41.040777 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" event={"ID":"dc645406-ab7f-4676-a442-373f2251f8d1","Type":"ContainerDied","Data":"31798230d7642ce34a3036b82b522095d735f205b6518055d6607e9f623fdb9a"} Jan 05 14:21:42 crc kubenswrapper[4740]: I0105 14:21:42.630981 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" Jan 05 14:21:42 crc kubenswrapper[4740]: I0105 14:21:42.800367 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc645406-ab7f-4676-a442-373f2251f8d1-ssh-key\") pod \"dc645406-ab7f-4676-a442-373f2251f8d1\" (UID: \"dc645406-ab7f-4676-a442-373f2251f8d1\") " Jan 05 14:21:42 crc kubenswrapper[4740]: I0105 14:21:42.800417 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc645406-ab7f-4676-a442-373f2251f8d1-inventory\") pod \"dc645406-ab7f-4676-a442-373f2251f8d1\" (UID: \"dc645406-ab7f-4676-a442-373f2251f8d1\") " Jan 05 14:21:42 crc kubenswrapper[4740]: I0105 14:21:42.800491 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp2gf\" (UniqueName: \"kubernetes.io/projected/dc645406-ab7f-4676-a442-373f2251f8d1-kube-api-access-bp2gf\") pod \"dc645406-ab7f-4676-a442-373f2251f8d1\" (UID: \"dc645406-ab7f-4676-a442-373f2251f8d1\") " Jan 05 14:21:42 crc kubenswrapper[4740]: I0105 14:21:42.820362 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc645406-ab7f-4676-a442-373f2251f8d1-kube-api-access-bp2gf" (OuterVolumeSpecName: "kube-api-access-bp2gf") pod "dc645406-ab7f-4676-a442-373f2251f8d1" (UID: "dc645406-ab7f-4676-a442-373f2251f8d1"). InnerVolumeSpecName "kube-api-access-bp2gf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:21:42 crc kubenswrapper[4740]: I0105 14:21:42.834349 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc645406-ab7f-4676-a442-373f2251f8d1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dc645406-ab7f-4676-a442-373f2251f8d1" (UID: "dc645406-ab7f-4676-a442-373f2251f8d1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:21:42 crc kubenswrapper[4740]: I0105 14:21:42.844196 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc645406-ab7f-4676-a442-373f2251f8d1-inventory" (OuterVolumeSpecName: "inventory") pod "dc645406-ab7f-4676-a442-373f2251f8d1" (UID: "dc645406-ab7f-4676-a442-373f2251f8d1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:21:42 crc kubenswrapper[4740]: I0105 14:21:42.904342 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc645406-ab7f-4676-a442-373f2251f8d1-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:21:42 crc kubenswrapper[4740]: I0105 14:21:42.904368 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc645406-ab7f-4676-a442-373f2251f8d1-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:21:42 crc kubenswrapper[4740]: I0105 14:21:42.904378 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp2gf\" (UniqueName: \"kubernetes.io/projected/dc645406-ab7f-4676-a442-373f2251f8d1-kube-api-access-bp2gf\") on node \"crc\" DevicePath \"\"" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.067895 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" event={"ID":"dc645406-ab7f-4676-a442-373f2251f8d1","Type":"ContainerDied","Data":"8a279871e06b07684cc4200e8cd7865b898c696ab85fe08e5efaf351c343085b"} Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.067937 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a279871e06b07684cc4200e8cd7865b898c696ab85fe08e5efaf351c343085b" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.067940 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.155736 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn"] Jan 05 14:21:43 crc kubenswrapper[4740]: E0105 14:21:43.156608 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc645406-ab7f-4676-a442-373f2251f8d1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.156695 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc645406-ab7f-4676-a442-373f2251f8d1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.157128 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc645406-ab7f-4676-a442-373f2251f8d1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.158219 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.163600 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.163719 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.164031 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.164623 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.180409 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn"] Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.315007 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f16ba41-c679-41ca-80e4-508f95ff78ca-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn\" (UID: \"8f16ba41-c679-41ca-80e4-508f95ff78ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.315280 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f16ba41-c679-41ca-80e4-508f95ff78ca-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn\" (UID: \"8f16ba41-c679-41ca-80e4-508f95ff78ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.315599 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4hlg\" (UniqueName: \"kubernetes.io/projected/8f16ba41-c679-41ca-80e4-508f95ff78ca-kube-api-access-c4hlg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn\" (UID: \"8f16ba41-c679-41ca-80e4-508f95ff78ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.420021 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f16ba41-c679-41ca-80e4-508f95ff78ca-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn\" (UID: \"8f16ba41-c679-41ca-80e4-508f95ff78ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.420475 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f16ba41-c679-41ca-80e4-508f95ff78ca-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn\" (UID: \"8f16ba41-c679-41ca-80e4-508f95ff78ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.420588 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4hlg\" (UniqueName: \"kubernetes.io/projected/8f16ba41-c679-41ca-80e4-508f95ff78ca-kube-api-access-c4hlg\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn\" (UID: \"8f16ba41-c679-41ca-80e4-508f95ff78ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.425764 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f16ba41-c679-41ca-80e4-508f95ff78ca-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn\" (UID: \"8f16ba41-c679-41ca-80e4-508f95ff78ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.431937 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f16ba41-c679-41ca-80e4-508f95ff78ca-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn\" (UID: \"8f16ba41-c679-41ca-80e4-508f95ff78ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.448289 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4hlg\" (UniqueName: \"kubernetes.io/projected/8f16ba41-c679-41ca-80e4-508f95ff78ca-kube-api-access-c4hlg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn\" (UID: \"8f16ba41-c679-41ca-80e4-508f95ff78ca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" Jan 05 14:21:43 crc kubenswrapper[4740]: I0105 14:21:43.486558 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" Jan 05 14:21:44 crc kubenswrapper[4740]: I0105 14:21:44.290436 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn"] Jan 05 14:21:45 crc kubenswrapper[4740]: I0105 14:21:45.099373 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" event={"ID":"8f16ba41-c679-41ca-80e4-508f95ff78ca","Type":"ContainerStarted","Data":"3a99aad7486736996399f77266f58519e1c161978b6b1b96b548bf93ccf0ae94"} Jan 05 14:21:46 crc kubenswrapper[4740]: I0105 14:21:46.112208 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" event={"ID":"8f16ba41-c679-41ca-80e4-508f95ff78ca","Type":"ContainerStarted","Data":"d48037be47e44ba0f71e9ce225a534174a9611d8d60a446a9556587e36480ce5"} Jan 05 14:21:46 crc kubenswrapper[4740]: I0105 14:21:46.136170 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" podStartSLOduration=2.519620701 podStartE2EDuration="3.136142195s" podCreationTimestamp="2026-01-05 14:21:43 +0000 UTC" firstStartedPulling="2026-01-05 14:21:44.306276898 +0000 UTC m=+1953.613185477" lastFinishedPulling="2026-01-05 14:21:44.922798392 +0000 UTC m=+1954.229706971" observedRunningTime="2026-01-05 14:21:46.1289541 +0000 UTC m=+1955.435862719" watchObservedRunningTime="2026-01-05 14:21:46.136142195 +0000 UTC m=+1955.443050774" Jan 05 14:21:46 crc kubenswrapper[4740]: I0105 14:21:46.969400 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:21:46 crc kubenswrapper[4740]: E0105 14:21:46.970041 4740 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:21:52 crc kubenswrapper[4740]: I0105 14:21:52.177354 4740 generic.go:334] "Generic (PLEG): container finished" podID="8f16ba41-c679-41ca-80e4-508f95ff78ca" containerID="d48037be47e44ba0f71e9ce225a534174a9611d8d60a446a9556587e36480ce5" exitCode=0 Jan 05 14:21:52 crc kubenswrapper[4740]: I0105 14:21:52.177406 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" event={"ID":"8f16ba41-c679-41ca-80e4-508f95ff78ca","Type":"ContainerDied","Data":"d48037be47e44ba0f71e9ce225a534174a9611d8d60a446a9556587e36480ce5"} Jan 05 14:21:53 crc kubenswrapper[4740]: I0105 14:21:53.811734 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" Jan 05 14:21:53 crc kubenswrapper[4740]: I0105 14:21:53.902492 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f16ba41-c679-41ca-80e4-508f95ff78ca-inventory\") pod \"8f16ba41-c679-41ca-80e4-508f95ff78ca\" (UID: \"8f16ba41-c679-41ca-80e4-508f95ff78ca\") " Jan 05 14:21:53 crc kubenswrapper[4740]: I0105 14:21:53.903472 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f16ba41-c679-41ca-80e4-508f95ff78ca-ssh-key\") pod \"8f16ba41-c679-41ca-80e4-508f95ff78ca\" (UID: \"8f16ba41-c679-41ca-80e4-508f95ff78ca\") " Jan 05 14:21:53 crc kubenswrapper[4740]: I0105 14:21:53.903674 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4hlg\" (UniqueName: \"kubernetes.io/projected/8f16ba41-c679-41ca-80e4-508f95ff78ca-kube-api-access-c4hlg\") pod \"8f16ba41-c679-41ca-80e4-508f95ff78ca\" (UID: \"8f16ba41-c679-41ca-80e4-508f95ff78ca\") " Jan 05 14:21:53 crc kubenswrapper[4740]: I0105 14:21:53.910195 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f16ba41-c679-41ca-80e4-508f95ff78ca-kube-api-access-c4hlg" (OuterVolumeSpecName: "kube-api-access-c4hlg") pod "8f16ba41-c679-41ca-80e4-508f95ff78ca" (UID: "8f16ba41-c679-41ca-80e4-508f95ff78ca"). InnerVolumeSpecName "kube-api-access-c4hlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:21:53 crc kubenswrapper[4740]: I0105 14:21:53.937507 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f16ba41-c679-41ca-80e4-508f95ff78ca-inventory" (OuterVolumeSpecName: "inventory") pod "8f16ba41-c679-41ca-80e4-508f95ff78ca" (UID: "8f16ba41-c679-41ca-80e4-508f95ff78ca"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:21:53 crc kubenswrapper[4740]: I0105 14:21:53.941331 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f16ba41-c679-41ca-80e4-508f95ff78ca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8f16ba41-c679-41ca-80e4-508f95ff78ca" (UID: "8f16ba41-c679-41ca-80e4-508f95ff78ca"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.006890 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f16ba41-c679-41ca-80e4-508f95ff78ca-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.006933 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4hlg\" (UniqueName: \"kubernetes.io/projected/8f16ba41-c679-41ca-80e4-508f95ff78ca-kube-api-access-c4hlg\") on node \"crc\" DevicePath \"\"" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.006946 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f16ba41-c679-41ca-80e4-508f95ff78ca-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.207161 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" event={"ID":"8f16ba41-c679-41ca-80e4-508f95ff78ca","Type":"ContainerDied","Data":"3a99aad7486736996399f77266f58519e1c161978b6b1b96b548bf93ccf0ae94"} Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.207383 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a99aad7486736996399f77266f58519e1c161978b6b1b96b548bf93ccf0ae94" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.207514 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.272590 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g"] Jan 05 14:21:54 crc kubenswrapper[4740]: E0105 14:21:54.273444 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f16ba41-c679-41ca-80e4-508f95ff78ca" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.273474 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f16ba41-c679-41ca-80e4-508f95ff78ca" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.273946 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f16ba41-c679-41ca-80e4-508f95ff78ca" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.275457 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.278846 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.279077 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.279363 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.279988 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.305339 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g"] Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.327410 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbw7f\" (UniqueName: \"kubernetes.io/projected/fd7ca708-94a1-4016-a666-8a9b1eb34a62-kube-api-access-pbw7f\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xtb2g\" (UID: \"fd7ca708-94a1-4016-a666-8a9b1eb34a62\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.327595 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd7ca708-94a1-4016-a666-8a9b1eb34a62-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xtb2g\" (UID: \"fd7ca708-94a1-4016-a666-8a9b1eb34a62\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.328032 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd7ca708-94a1-4016-a666-8a9b1eb34a62-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xtb2g\" (UID: \"fd7ca708-94a1-4016-a666-8a9b1eb34a62\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.429677 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd7ca708-94a1-4016-a666-8a9b1eb34a62-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xtb2g\" (UID: \"fd7ca708-94a1-4016-a666-8a9b1eb34a62\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.429827 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd7ca708-94a1-4016-a666-8a9b1eb34a62-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xtb2g\" (UID: \"fd7ca708-94a1-4016-a666-8a9b1eb34a62\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.429905 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbw7f\" (UniqueName: \"kubernetes.io/projected/fd7ca708-94a1-4016-a666-8a9b1eb34a62-kube-api-access-pbw7f\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xtb2g\" (UID: 
\"fd7ca708-94a1-4016-a666-8a9b1eb34a62\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.435197 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd7ca708-94a1-4016-a666-8a9b1eb34a62-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xtb2g\" (UID: \"fd7ca708-94a1-4016-a666-8a9b1eb34a62\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.440226 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd7ca708-94a1-4016-a666-8a9b1eb34a62-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xtb2g\" (UID: \"fd7ca708-94a1-4016-a666-8a9b1eb34a62\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.446080 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbw7f\" (UniqueName: \"kubernetes.io/projected/fd7ca708-94a1-4016-a666-8a9b1eb34a62-kube-api-access-pbw7f\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xtb2g\" (UID: \"fd7ca708-94a1-4016-a666-8a9b1eb34a62\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" Jan 05 14:21:54 crc kubenswrapper[4740]: I0105 14:21:54.606459 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" Jan 05 14:21:55 crc kubenswrapper[4740]: I0105 14:21:55.228858 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g"] Jan 05 14:21:55 crc kubenswrapper[4740]: I0105 14:21:55.991649 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w5t8w"] Jan 05 14:21:55 crc kubenswrapper[4740]: I0105 14:21:55.995614 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5t8w" Jan 05 14:21:56 crc kubenswrapper[4740]: I0105 14:21:56.004996 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5t8w"] Jan 05 14:21:56 crc kubenswrapper[4740]: I0105 14:21:56.078922 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-catalog-content\") pod \"certified-operators-w5t8w\" (UID: \"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7\") " pod="openshift-marketplace/certified-operators-w5t8w" Jan 05 14:21:56 crc kubenswrapper[4740]: I0105 14:21:56.078979 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zdc4\" (UniqueName: \"kubernetes.io/projected/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-kube-api-access-2zdc4\") pod \"certified-operators-w5t8w\" (UID: \"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7\") " pod="openshift-marketplace/certified-operators-w5t8w" Jan 05 14:21:56 crc kubenswrapper[4740]: I0105 14:21:56.079001 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-utilities\") pod \"certified-operators-w5t8w\" (UID: \"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7\") " pod="openshift-marketplace/certified-operators-w5t8w" Jan 05 14:21:56 crc kubenswrapper[4740]: I0105 14:21:56.181209 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-catalog-content\") pod \"certified-operators-w5t8w\" (UID: \"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7\") " pod="openshift-marketplace/certified-operators-w5t8w" Jan 05 14:21:56 crc kubenswrapper[4740]: I0105 14:21:56.181264 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zdc4\" (UniqueName: \"kubernetes.io/projected/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-kube-api-access-2zdc4\") pod \"certified-operators-w5t8w\" (UID: \"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7\") " pod="openshift-marketplace/certified-operators-w5t8w" Jan 05 14:21:56 crc kubenswrapper[4740]: I0105 14:21:56.181307 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-utilities\") pod \"certified-operators-w5t8w\" (UID: \"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7\") " pod="openshift-marketplace/certified-operators-w5t8w" Jan 05 14:21:56 crc kubenswrapper[4740]: I0105 14:21:56.181730 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-catalog-content\") pod \"certified-operators-w5t8w\" (UID: \"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7\") " pod="openshift-marketplace/certified-operators-w5t8w" Jan 05 14:21:56 crc kubenswrapper[4740]: I0105 14:21:56.181801 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-utilities\") pod \"certified-operators-w5t8w\" (UID: \"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7\") " pod="openshift-marketplace/certified-operators-w5t8w" Jan 05 14:21:56 crc kubenswrapper[4740]: I0105 14:21:56.217573 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2zdc4\" (UniqueName: \"kubernetes.io/projected/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-kube-api-access-2zdc4\") pod \"certified-operators-w5t8w\" (UID: \"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7\") " pod="openshift-marketplace/certified-operators-w5t8w" Jan 05 14:21:56 crc kubenswrapper[4740]: I0105 14:21:56.237101 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" event={"ID":"fd7ca708-94a1-4016-a666-8a9b1eb34a62","Type":"ContainerStarted","Data":"11ee6c215e1d500d721a8090f406d65800100508142accafb7491a4a887fd092"} Jan 05 14:21:56 crc kubenswrapper[4740]: I0105 14:21:56.237142 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" event={"ID":"fd7ca708-94a1-4016-a666-8a9b1eb34a62","Type":"ContainerStarted","Data":"06f6add547dcd7e0e4c633aab25310a72d03e8d5c184fa324c8f3c91625613a3"} Jan 05 14:21:56 crc kubenswrapper[4740]: I0105 14:21:56.256971 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" podStartSLOduration=1.632955644 podStartE2EDuration="2.256953182s" podCreationTimestamp="2026-01-05 14:21:54 +0000 UTC" firstStartedPulling="2026-01-05 14:21:55.232088094 +0000 UTC m=+1964.538996683" lastFinishedPulling="2026-01-05 14:21:55.856085642 +0000 UTC m=+1965.162994221" observedRunningTime="2026-01-05 14:21:56.254135516 +0000 UTC m=+1965.561044095" watchObservedRunningTime="2026-01-05 14:21:56.256953182 +0000 UTC m=+1965.563861761" Jan 05 14:21:56 crc kubenswrapper[4740]: I0105 14:21:56.351206 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w5t8w" Jan 05 14:21:56 crc kubenswrapper[4740]: I0105 14:21:56.893510 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5t8w"] Jan 05 14:21:56 crc kubenswrapper[4740]: W0105 14:21:56.894838 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcffe6fd6_ea8b_4919_b91a_b80de69cd3c7.slice/crio-c45e4a71582857a31e3beac5bf4c78345216f1ae8c47018dcfd7b351473c6694 WatchSource:0}: Error finding container c45e4a71582857a31e3beac5bf4c78345216f1ae8c47018dcfd7b351473c6694: Status 404 returned error can't find the container with id c45e4a71582857a31e3beac5bf4c78345216f1ae8c47018dcfd7b351473c6694 Jan 05 14:21:57 crc kubenswrapper[4740]: I0105 14:21:57.247464 4740 generic.go:334] "Generic (PLEG): container finished" podID="cffe6fd6-ea8b-4919-b91a-b80de69cd3c7" containerID="79cd3c7cf14d21ee52c9823604ae9529d11d6dbcc913fd986632f12b381f9e98" exitCode=0 Jan 05 14:21:57 crc kubenswrapper[4740]: I0105 14:21:57.249004 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5t8w" event={"ID":"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7","Type":"ContainerDied","Data":"79cd3c7cf14d21ee52c9823604ae9529d11d6dbcc913fd986632f12b381f9e98"} Jan 05 14:21:57 crc kubenswrapper[4740]: I0105 14:21:57.249030 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5t8w" event={"ID":"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7","Type":"ContainerStarted","Data":"c45e4a71582857a31e3beac5bf4c78345216f1ae8c47018dcfd7b351473c6694"} Jan 05 14:21:58 crc kubenswrapper[4740]: I0105 14:21:58.260241 4740 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-w5t8w" event={"ID":"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7","Type":"ContainerStarted","Data":"706c5f9c26e779da00ac35532d0546cc3f7e13c7990ef05acb0aeea8121aa007"} Jan 05 14:21:58 crc kubenswrapper[4740]: I0105 14:21:58.968287 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:21:58 crc kubenswrapper[4740]: E0105 14:21:58.968865 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:22:00 crc kubenswrapper[4740]: I0105 14:22:00.289042 4740 generic.go:334] "Generic (PLEG): container finished" podID="cffe6fd6-ea8b-4919-b91a-b80de69cd3c7" containerID="706c5f9c26e779da00ac35532d0546cc3f7e13c7990ef05acb0aeea8121aa007" exitCode=0 Jan 05 14:22:00 crc kubenswrapper[4740]: I0105 14:22:00.289177 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5t8w" event={"ID":"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7","Type":"ContainerDied","Data":"706c5f9c26e779da00ac35532d0546cc3f7e13c7990ef05acb0aeea8121aa007"} Jan 05 14:22:01 crc kubenswrapper[4740]: I0105 14:22:01.302858 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5t8w" event={"ID":"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7","Type":"ContainerStarted","Data":"a3f0a27573e0358fad0cc2ec8e04cc76a5dae42d7483c25842fae53a02a01f48"} Jan 05 14:22:01 crc kubenswrapper[4740]: I0105 14:22:01.325749 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w5t8w" podStartSLOduration=2.727795588 podStartE2EDuration="6.325730363s" podCreationTimestamp="2026-01-05 14:21:55 +0000 UTC" firstStartedPulling="2026-01-05 14:21:57.251907397 +0000 UTC m=+1966.558815976" lastFinishedPulling="2026-01-05 14:22:00.849842172 +0000 UTC m=+1970.156750751" observedRunningTime="2026-01-05 14:22:01.319544975 +0000 UTC m=+1970.626453544" watchObservedRunningTime="2026-01-05 14:22:01.325730363 +0000 UTC m=+1970.632638942" Jan 05 14:22:04 crc kubenswrapper[4740]: I0105 14:22:04.049744 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mwvss"] Jan 05 14:22:04 crc kubenswrapper[4740]: I0105 14:22:04.095738 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mwvss"] Jan 05 14:22:04 crc kubenswrapper[4740]: I0105 14:22:04.988304 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37ab0150-0aef-4e09-b54b-ff6201665b33" path="/var/lib/kubelet/pods/37ab0150-0aef-4e09-b54b-ff6201665b33/volumes" Jan 05 14:22:06 crc kubenswrapper[4740]: I0105 14:22:06.351586 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w5t8w" Jan 05 14:22:06 crc kubenswrapper[4740]: I0105 14:22:06.351658 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w5t8w" Jan 05 14:22:06 crc kubenswrapper[4740]: I0105 14:22:06.445978 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-w5t8w" Jan 05 14:22:06 crc kubenswrapper[4740]: I0105 14:22:06.523383 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w5t8w" Jan 05 14:22:06 crc kubenswrapper[4740]: I0105 14:22:06.699983 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w5t8w"] Jan 05 14:22:08 crc kubenswrapper[4740]: I0105 14:22:08.394839 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w5t8w" podUID="cffe6fd6-ea8b-4919-b91a-b80de69cd3c7" containerName="registry-server" containerID="cri-o://a3f0a27573e0358fad0cc2ec8e04cc76a5dae42d7483c25842fae53a02a01f48" gracePeriod=2 Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.033740 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w5t8w" Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.172187 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zdc4\" (UniqueName: \"kubernetes.io/projected/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-kube-api-access-2zdc4\") pod \"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7\" (UID: \"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7\") " Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.172384 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-catalog-content\") pod \"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7\" (UID: \"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7\") " Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.172418 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-utilities\") pod \"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7\" (UID: \"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7\") " Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.173542 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-utilities" (OuterVolumeSpecName: "utilities") pod "cffe6fd6-ea8b-4919-b91a-b80de69cd3c7" (UID: "cffe6fd6-ea8b-4919-b91a-b80de69cd3c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.185617 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-kube-api-access-2zdc4" (OuterVolumeSpecName: "kube-api-access-2zdc4") pod "cffe6fd6-ea8b-4919-b91a-b80de69cd3c7" (UID: "cffe6fd6-ea8b-4919-b91a-b80de69cd3c7"). InnerVolumeSpecName "kube-api-access-2zdc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.224193 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cffe6fd6-ea8b-4919-b91a-b80de69cd3c7" (UID: "cffe6fd6-ea8b-4919-b91a-b80de69cd3c7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.275600 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.276076 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.276165 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zdc4\" (UniqueName: \"kubernetes.io/projected/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7-kube-api-access-2zdc4\") on node \"crc\" DevicePath \"\"" Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.407092 4740 generic.go:334] "Generic (PLEG): container finished" podID="cffe6fd6-ea8b-4919-b91a-b80de69cd3c7" containerID="a3f0a27573e0358fad0cc2ec8e04cc76a5dae42d7483c25842fae53a02a01f48" exitCode=0 Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.407143 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w5t8w" Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.407147 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5t8w" event={"ID":"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7","Type":"ContainerDied","Data":"a3f0a27573e0358fad0cc2ec8e04cc76a5dae42d7483c25842fae53a02a01f48"} Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.407247 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5t8w" event={"ID":"cffe6fd6-ea8b-4919-b91a-b80de69cd3c7","Type":"ContainerDied","Data":"c45e4a71582857a31e3beac5bf4c78345216f1ae8c47018dcfd7b351473c6694"} Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.407271 4740 scope.go:117] "RemoveContainer" containerID="a3f0a27573e0358fad0cc2ec8e04cc76a5dae42d7483c25842fae53a02a01f48" Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.429300 4740 scope.go:117] "RemoveContainer" containerID="706c5f9c26e779da00ac35532d0546cc3f7e13c7990ef05acb0aeea8121aa007" Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.450700 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w5t8w"] Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.464721 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w5t8w"] Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.468625 4740 scope.go:117] "RemoveContainer" containerID="79cd3c7cf14d21ee52c9823604ae9529d11d6dbcc913fd986632f12b381f9e98" Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.518929 4740 scope.go:117] "RemoveContainer" containerID="a3f0a27573e0358fad0cc2ec8e04cc76a5dae42d7483c25842fae53a02a01f48" Jan 05 14:22:09 crc kubenswrapper[4740]: E0105 14:22:09.519989 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3f0a27573e0358fad0cc2ec8e04cc76a5dae42d7483c25842fae53a02a01f48\": container with ID starting with a3f0a27573e0358fad0cc2ec8e04cc76a5dae42d7483c25842fae53a02a01f48 not found: ID does not exist" containerID="a3f0a27573e0358fad0cc2ec8e04cc76a5dae42d7483c25842fae53a02a01f48" Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.520023 
4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3f0a27573e0358fad0cc2ec8e04cc76a5dae42d7483c25842fae53a02a01f48"} err="failed to get container status \"a3f0a27573e0358fad0cc2ec8e04cc76a5dae42d7483c25842fae53a02a01f48\": rpc error: code = NotFound desc = could not find container \"a3f0a27573e0358fad0cc2ec8e04cc76a5dae42d7483c25842fae53a02a01f48\": container with ID starting with a3f0a27573e0358fad0cc2ec8e04cc76a5dae42d7483c25842fae53a02a01f48 not found: ID does not exist" Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.520076 4740 scope.go:117] "RemoveContainer" containerID="706c5f9c26e779da00ac35532d0546cc3f7e13c7990ef05acb0aeea8121aa007" Jan 05 14:22:09 crc kubenswrapper[4740]: E0105 14:22:09.520467 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"706c5f9c26e779da00ac35532d0546cc3f7e13c7990ef05acb0aeea8121aa007\": container with ID starting with 706c5f9c26e779da00ac35532d0546cc3f7e13c7990ef05acb0aeea8121aa007 not found: ID does not exist" containerID="706c5f9c26e779da00ac35532d0546cc3f7e13c7990ef05acb0aeea8121aa007" Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.520509 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"706c5f9c26e779da00ac35532d0546cc3f7e13c7990ef05acb0aeea8121aa007"} err="failed to get container status \"706c5f9c26e779da00ac35532d0546cc3f7e13c7990ef05acb0aeea8121aa007\": rpc error: code = NotFound desc = could not find container \"706c5f9c26e779da00ac35532d0546cc3f7e13c7990ef05acb0aeea8121aa007\": container with ID starting with 706c5f9c26e779da00ac35532d0546cc3f7e13c7990ef05acb0aeea8121aa007 not found: ID does not exist" Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.520536 4740 scope.go:117] "RemoveContainer" containerID="79cd3c7cf14d21ee52c9823604ae9529d11d6dbcc913fd986632f12b381f9e98" Jan 05 14:22:09 crc kubenswrapper[4740]: E0105 14:22:09.520835 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79cd3c7cf14d21ee52c9823604ae9529d11d6dbcc913fd986632f12b381f9e98\": container with ID starting with 79cd3c7cf14d21ee52c9823604ae9529d11d6dbcc913fd986632f12b381f9e98 not found: ID does not exist" containerID="79cd3c7cf14d21ee52c9823604ae9529d11d6dbcc913fd986632f12b381f9e98" Jan 05 14:22:09 crc kubenswrapper[4740]: I0105 14:22:09.520868 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79cd3c7cf14d21ee52c9823604ae9529d11d6dbcc913fd986632f12b381f9e98"} err="failed to get container status \"79cd3c7cf14d21ee52c9823604ae9529d11d6dbcc913fd986632f12b381f9e98\": rpc error: code = NotFound desc = could not find container \"79cd3c7cf14d21ee52c9823604ae9529d11d6dbcc913fd986632f12b381f9e98\": container with ID starting with 79cd3c7cf14d21ee52c9823604ae9529d11d6dbcc913fd986632f12b381f9e98 not found: ID does not exist" Jan 05 14:22:10 crc kubenswrapper[4740]: I0105 14:22:10.988909 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cffe6fd6-ea8b-4919-b91a-b80de69cd3c7" path="/var/lib/kubelet/pods/cffe6fd6-ea8b-4919-b91a-b80de69cd3c7/volumes" Jan 05 14:22:11 crc kubenswrapper[4740]: I0105 14:22:11.968491 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:22:11 crc kubenswrapper[4740]: E0105 14:22:11.969104 4740 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:22:24 crc kubenswrapper[4740]: I0105 14:22:24.968367 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:22:24 crc kubenswrapper[4740]: E0105 14:22:24.969181 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:22:32 crc kubenswrapper[4740]: I0105 14:22:32.587436 4740 scope.go:117] "RemoveContainer" containerID="c4f147b5d5b4fddb6922d8baff48cd5c3c767f451bf4b031566a8a1a8c980899" Jan 05 14:22:38 crc kubenswrapper[4740]: I0105 14:22:38.907027 4740 generic.go:334] "Generic (PLEG): container finished" podID="fd7ca708-94a1-4016-a666-8a9b1eb34a62" containerID="11ee6c215e1d500d721a8090f406d65800100508142accafb7491a4a887fd092" exitCode=0 Jan 05 14:22:38 crc kubenswrapper[4740]: I0105 14:22:38.907112 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" event={"ID":"fd7ca708-94a1-4016-a666-8a9b1eb34a62","Type":"ContainerDied","Data":"11ee6c215e1d500d721a8090f406d65800100508142accafb7491a4a887fd092"} Jan 05 14:22:38 crc kubenswrapper[4740]: I0105 14:22:38.969114 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:22:39 crc kubenswrapper[4740]: I0105 14:22:39.928545 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"87db5e19923a3a4961dbb7d28184d9af48401a591023a911c3f7453ebbf03439"} Jan 05 14:22:40 crc kubenswrapper[4740]: I0105 14:22:40.404425 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" Jan 05 14:22:40 crc kubenswrapper[4740]: I0105 14:22:40.601469 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd7ca708-94a1-4016-a666-8a9b1eb34a62-ssh-key\") pod \"fd7ca708-94a1-4016-a666-8a9b1eb34a62\" (UID: \"fd7ca708-94a1-4016-a666-8a9b1eb34a62\") " Jan 05 14:22:40 crc kubenswrapper[4740]: I0105 14:22:40.601578 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd7ca708-94a1-4016-a666-8a9b1eb34a62-inventory\") pod \"fd7ca708-94a1-4016-a666-8a9b1eb34a62\" (UID: \"fd7ca708-94a1-4016-a666-8a9b1eb34a62\") " Jan 05 14:22:40 crc kubenswrapper[4740]: I0105 14:22:40.601613 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbw7f\" (UniqueName: \"kubernetes.io/projected/fd7ca708-94a1-4016-a666-8a9b1eb34a62-kube-api-access-pbw7f\") pod \"fd7ca708-94a1-4016-a666-8a9b1eb34a62\" (UID: \"fd7ca708-94a1-4016-a666-8a9b1eb34a62\") " Jan 05 14:22:40 crc kubenswrapper[4740]: I0105 14:22:40.607338 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd7ca708-94a1-4016-a666-8a9b1eb34a62-kube-api-access-pbw7f" (OuterVolumeSpecName: "kube-api-access-pbw7f") pod "fd7ca708-94a1-4016-a666-8a9b1eb34a62" (UID: "fd7ca708-94a1-4016-a666-8a9b1eb34a62"). InnerVolumeSpecName "kube-api-access-pbw7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:22:40 crc kubenswrapper[4740]: I0105 14:22:40.632376 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd7ca708-94a1-4016-a666-8a9b1eb34a62-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fd7ca708-94a1-4016-a666-8a9b1eb34a62" (UID: "fd7ca708-94a1-4016-a666-8a9b1eb34a62"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:22:40 crc kubenswrapper[4740]: I0105 14:22:40.633473 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd7ca708-94a1-4016-a666-8a9b1eb34a62-inventory" (OuterVolumeSpecName: "inventory") pod "fd7ca708-94a1-4016-a666-8a9b1eb34a62" (UID: "fd7ca708-94a1-4016-a666-8a9b1eb34a62"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:22:40 crc kubenswrapper[4740]: I0105 14:22:40.707136 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd7ca708-94a1-4016-a666-8a9b1eb34a62-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:22:40 crc kubenswrapper[4740]: I0105 14:22:40.707193 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd7ca708-94a1-4016-a666-8a9b1eb34a62-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:22:40 crc kubenswrapper[4740]: I0105 14:22:40.707218 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbw7f\" (UniqueName: \"kubernetes.io/projected/fd7ca708-94a1-4016-a666-8a9b1eb34a62-kube-api-access-pbw7f\") on node \"crc\" DevicePath \"\"" Jan 05 14:22:40 crc kubenswrapper[4740]: I0105 14:22:40.941384 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" event={"ID":"fd7ca708-94a1-4016-a666-8a9b1eb34a62","Type":"ContainerDied","Data":"06f6add547dcd7e0e4c633aab25310a72d03e8d5c184fa324c8f3c91625613a3"} Jan 05 14:22:40 crc kubenswrapper[4740]: I0105 14:22:40.941698 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06f6add547dcd7e0e4c633aab25310a72d03e8d5c184fa324c8f3c91625613a3" Jan 05 14:22:40 crc kubenswrapper[4740]: I0105 14:22:40.941433 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xtb2g" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.090538 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz"] Jan 05 14:22:41 crc kubenswrapper[4740]: E0105 14:22:41.091213 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cffe6fd6-ea8b-4919-b91a-b80de69cd3c7" containerName="registry-server" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.091238 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cffe6fd6-ea8b-4919-b91a-b80de69cd3c7" containerName="registry-server" Jan 05 14:22:41 crc kubenswrapper[4740]: E0105 14:22:41.091274 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cffe6fd6-ea8b-4919-b91a-b80de69cd3c7" containerName="extract-utilities" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.091283 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cffe6fd6-ea8b-4919-b91a-b80de69cd3c7" containerName="extract-utilities" Jan 05 14:22:41 crc kubenswrapper[4740]: E0105 14:22:41.091297 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cffe6fd6-ea8b-4919-b91a-b80de69cd3c7" containerName="extract-content" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.091317 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cffe6fd6-ea8b-4919-b91a-b80de69cd3c7" containerName="extract-content" Jan 05 14:22:41 crc kubenswrapper[4740]: E0105 14:22:41.091336 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd7ca708-94a1-4016-a666-8a9b1eb34a62" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.091348 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd7ca708-94a1-4016-a666-8a9b1eb34a62" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.091611 4740 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="cffe6fd6-ea8b-4919-b91a-b80de69cd3c7" containerName="registry-server" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.091657 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd7ca708-94a1-4016-a666-8a9b1eb34a62" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.092553 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.099550 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.099776 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.100496 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.102410 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.105787 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz"] Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.118842 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz\" (UID: \"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.119193 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgvrk\" (UniqueName: \"kubernetes.io/projected/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-kube-api-access-vgvrk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz\" (UID: \"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.119501 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz\" (UID: \"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.222389 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz\" (UID: \"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.222552 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgvrk\" (UniqueName: \"kubernetes.io/projected/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-kube-api-access-vgvrk\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz\" (UID: \"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.222633 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz\" (UID: \"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.227959 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz\" (UID: \"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.228006 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz\" (UID: \"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.241591 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgvrk\" (UniqueName: \"kubernetes.io/projected/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-kube-api-access-vgvrk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz\" (UID: \"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" Jan 05 14:22:41 crc kubenswrapper[4740]: I0105 14:22:41.421564 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" Jan 05 14:22:42 crc kubenswrapper[4740]: I0105 14:22:42.071546 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz"] Jan 05 14:22:42 crc kubenswrapper[4740]: W0105 14:22:42.074654 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa2b5d30_c6dd_4d84_b33f_b3e855e61b24.slice/crio-da4e5ed0e4be515519d9801b754ccae9a18b58a4fddda44726929bdbbae220f0 WatchSource:0}: Error finding container da4e5ed0e4be515519d9801b754ccae9a18b58a4fddda44726929bdbbae220f0: Status 404 returned error can't find the container with id da4e5ed0e4be515519d9801b754ccae9a18b58a4fddda44726929bdbbae220f0 Jan 05 14:22:42 crc kubenswrapper[4740]: I0105 14:22:42.987220 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" event={"ID":"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24","Type":"ContainerStarted","Data":"e1b187c888b22939f0ea3bd031c8e0cf6edab909eacdf5419181a99dfa3f8f6d"} Jan 05 14:22:42 crc kubenswrapper[4740]: I0105 14:22:42.988145 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" event={"ID":"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24","Type":"ContainerStarted","Data":"da4e5ed0e4be515519d9801b754ccae9a18b58a4fddda44726929bdbbae220f0"} Jan 05 14:22:43 crc kubenswrapper[4740]: I0105 14:22:43.002703 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" podStartSLOduration=1.505572762 podStartE2EDuration="2.00267739s" podCreationTimestamp="2026-01-05 14:22:41 +0000 UTC" firstStartedPulling="2026-01-05 14:22:42.078498359 +0000 UTC m=+2011.385406948" lastFinishedPulling="2026-01-05 14:22:42.575602997 +0000 UTC m=+2011.882511576" observedRunningTime="2026-01-05 14:22:42.999426741 +0000 UTC m=+2012.306335330" watchObservedRunningTime="2026-01-05 14:22:43.00267739 +0000 UTC m=+2012.309585979" Jan 05 14:23:40 crc kubenswrapper[4740]: I0105 14:23:40.755384 4740 generic.go:334] "Generic (PLEG): container finished" podID="aa2b5d30-c6dd-4d84-b33f-b3e855e61b24" containerID="e1b187c888b22939f0ea3bd031c8e0cf6edab909eacdf5419181a99dfa3f8f6d" exitCode=0 Jan 05 14:23:40 crc kubenswrapper[4740]: I0105 14:23:40.756051 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" event={"ID":"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24","Type":"ContainerDied","Data":"e1b187c888b22939f0ea3bd031c8e0cf6edab909eacdf5419181a99dfa3f8f6d"} Jan 05 14:23:42 crc kubenswrapper[4740]: I0105 14:23:42.430458 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" Jan 05 14:23:42 crc kubenswrapper[4740]: I0105 14:23:42.596568 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-ssh-key\") pod \"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24\" (UID: \"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24\") " Jan 05 14:23:42 crc kubenswrapper[4740]: I0105 14:23:42.596917 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgvrk\" (UniqueName: \"kubernetes.io/projected/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-kube-api-access-vgvrk\") pod \"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24\" (UID: \"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24\") " Jan 05 14:23:42 crc kubenswrapper[4740]: I0105 14:23:42.597096 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-inventory\") pod \"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24\" (UID: \"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24\") " Jan 05 14:23:42 crc kubenswrapper[4740]: I0105 14:23:42.605508 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-kube-api-access-vgvrk" (OuterVolumeSpecName: "kube-api-access-vgvrk") pod "aa2b5d30-c6dd-4d84-b33f-b3e855e61b24" (UID: "aa2b5d30-c6dd-4d84-b33f-b3e855e61b24"). InnerVolumeSpecName "kube-api-access-vgvrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:23:42 crc kubenswrapper[4740]: I0105 14:23:42.628666 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aa2b5d30-c6dd-4d84-b33f-b3e855e61b24" (UID: "aa2b5d30-c6dd-4d84-b33f-b3e855e61b24"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:23:42 crc kubenswrapper[4740]: I0105 14:23:42.639758 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-inventory" (OuterVolumeSpecName: "inventory") pod "aa2b5d30-c6dd-4d84-b33f-b3e855e61b24" (UID: "aa2b5d30-c6dd-4d84-b33f-b3e855e61b24"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:23:42 crc kubenswrapper[4740]: I0105 14:23:42.700615 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:23:42 crc kubenswrapper[4740]: I0105 14:23:42.700675 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:23:42 crc kubenswrapper[4740]: I0105 14:23:42.700696 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgvrk\" (UniqueName: \"kubernetes.io/projected/aa2b5d30-c6dd-4d84-b33f-b3e855e61b24-kube-api-access-vgvrk\") on node \"crc\" DevicePath \"\"" Jan 05 14:23:42 crc kubenswrapper[4740]: I0105 14:23:42.780539 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" event={"ID":"aa2b5d30-c6dd-4d84-b33f-b3e855e61b24","Type":"ContainerDied","Data":"da4e5ed0e4be515519d9801b754ccae9a18b58a4fddda44726929bdbbae220f0"} Jan 05 14:23:42 crc kubenswrapper[4740]: I0105 14:23:42.780599 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da4e5ed0e4be515519d9801b754ccae9a18b58a4fddda44726929bdbbae220f0" Jan 05 14:23:42 crc kubenswrapper[4740]: I0105 14:23:42.780658 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz" Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.020831 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-lwc8v"] Jan 05 14:23:43 crc kubenswrapper[4740]: E0105 14:23:43.021537 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2b5d30-c6dd-4d84-b33f-b3e855e61b24" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.021561 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2b5d30-c6dd-4d84-b33f-b3e855e61b24" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.021886 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa2b5d30-c6dd-4d84-b33f-b3e855e61b24" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.022926 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.026280 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.026793 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.026918 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.029013 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.033809 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-lwc8v"] Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.114926 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0946f8e-100f-4ceb-9766-2254ec001229-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-lwc8v\" (UID: \"c0946f8e-100f-4ceb-9766-2254ec001229\") " pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.114990 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffjhm\" (UniqueName: \"kubernetes.io/projected/c0946f8e-100f-4ceb-9766-2254ec001229-kube-api-access-ffjhm\") pod \"ssh-known-hosts-edpm-deployment-lwc8v\" (UID: \"c0946f8e-100f-4ceb-9766-2254ec001229\") " pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.115095 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c0946f8e-100f-4ceb-9766-2254ec001229-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-lwc8v\" (UID: \"c0946f8e-100f-4ceb-9766-2254ec001229\") " pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.216798 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffjhm\" (UniqueName: \"kubernetes.io/projected/c0946f8e-100f-4ceb-9766-2254ec001229-kube-api-access-ffjhm\") pod \"ssh-known-hosts-edpm-deployment-lwc8v\" (UID: \"c0946f8e-100f-4ceb-9766-2254ec001229\") " pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.216890 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c0946f8e-100f-4ceb-9766-2254ec001229-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-lwc8v\" (UID: \"c0946f8e-100f-4ceb-9766-2254ec001229\") " pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.217139 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0946f8e-100f-4ceb-9766-2254ec001229-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-lwc8v\" (UID: \"c0946f8e-100f-4ceb-9766-2254ec001229\") " pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" Jan 05 14:23:43 crc 
kubenswrapper[4740]: I0105 14:23:43.222858 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c0946f8e-100f-4ceb-9766-2254ec001229-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-lwc8v\" (UID: \"c0946f8e-100f-4ceb-9766-2254ec001229\") " pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.224176 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0946f8e-100f-4ceb-9766-2254ec001229-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-lwc8v\" (UID: \"c0946f8e-100f-4ceb-9766-2254ec001229\") " pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.233354 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffjhm\" (UniqueName: \"kubernetes.io/projected/c0946f8e-100f-4ceb-9766-2254ec001229-kube-api-access-ffjhm\") pod \"ssh-known-hosts-edpm-deployment-lwc8v\" (UID: \"c0946f8e-100f-4ceb-9766-2254ec001229\") " pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" Jan 05 14:23:43 crc kubenswrapper[4740]: I0105 14:23:43.368430 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" Jan 05 14:23:44 crc kubenswrapper[4740]: I0105 14:23:44.012034 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-lwc8v"] Jan 05 14:23:44 crc kubenswrapper[4740]: I0105 14:23:44.021805 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 05 14:23:44 crc kubenswrapper[4740]: I0105 14:23:44.806710 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" event={"ID":"c0946f8e-100f-4ceb-9766-2254ec001229","Type":"ContainerStarted","Data":"aa7a701997fcfbf8d8954b3f47a4694578d660a85ce49dd3879a1bef14f9a36e"} Jan 05 14:23:45 crc kubenswrapper[4740]: I0105 14:23:45.823490 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" event={"ID":"c0946f8e-100f-4ceb-9766-2254ec001229","Type":"ContainerStarted","Data":"b0950a0edcacd056f8f772c439f05b60eb321adf1177c18e6bee8151ff2f74c5"} Jan 05 14:23:45 crc kubenswrapper[4740]: I0105 14:23:45.856250 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" podStartSLOduration=3.315222835 podStartE2EDuration="3.856219766s" podCreationTimestamp="2026-01-05 14:23:42 +0000 UTC" firstStartedPulling="2026-01-05 14:23:44.021560067 +0000 UTC m=+2073.328468646" lastFinishedPulling="2026-01-05 14:23:44.562556998 +0000 UTC m=+2073.869465577" observedRunningTime="2026-01-05 14:23:45.845936906 +0000 UTC m=+2075.152845525" watchObservedRunningTime="2026-01-05 14:23:45.856219766 +0000 UTC m=+2075.163128375" Jan 05 14:23:48 crc kubenswrapper[4740]: I0105 14:23:48.065681 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-snmgh"] Jan 05 14:23:48 crc kubenswrapper[4740]: I0105 14:23:48.093743 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-snmgh"] Jan 05 14:23:48 crc kubenswrapper[4740]: I0105 14:23:48.985561 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04ef5055-0453-443e-b7cc-8fd93041e899" 
path="/var/lib/kubelet/pods/04ef5055-0453-443e-b7cc-8fd93041e899/volumes" Jan 05 14:23:52 crc kubenswrapper[4740]: I0105 14:23:52.917449 4740 generic.go:334] "Generic (PLEG): container finished" podID="c0946f8e-100f-4ceb-9766-2254ec001229" containerID="b0950a0edcacd056f8f772c439f05b60eb321adf1177c18e6bee8151ff2f74c5" exitCode=0 Jan 05 14:23:52 crc kubenswrapper[4740]: I0105 14:23:52.917576 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" event={"ID":"c0946f8e-100f-4ceb-9766-2254ec001229","Type":"ContainerDied","Data":"b0950a0edcacd056f8f772c439f05b60eb321adf1177c18e6bee8151ff2f74c5"} Jan 05 14:23:54 crc kubenswrapper[4740]: I0105 14:23:54.491270 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" Jan 05 14:23:54 crc kubenswrapper[4740]: I0105 14:23:54.633281 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffjhm\" (UniqueName: \"kubernetes.io/projected/c0946f8e-100f-4ceb-9766-2254ec001229-kube-api-access-ffjhm\") pod \"c0946f8e-100f-4ceb-9766-2254ec001229\" (UID: \"c0946f8e-100f-4ceb-9766-2254ec001229\") " Jan 05 14:23:54 crc kubenswrapper[4740]: I0105 14:23:54.633491 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0946f8e-100f-4ceb-9766-2254ec001229-ssh-key-openstack-edpm-ipam\") pod \"c0946f8e-100f-4ceb-9766-2254ec001229\" (UID: \"c0946f8e-100f-4ceb-9766-2254ec001229\") " Jan 05 14:23:54 crc kubenswrapper[4740]: I0105 14:23:54.633703 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c0946f8e-100f-4ceb-9766-2254ec001229-inventory-0\") pod \"c0946f8e-100f-4ceb-9766-2254ec001229\" (UID: \"c0946f8e-100f-4ceb-9766-2254ec001229\") " Jan 05 14:23:54 crc kubenswrapper[4740]: I0105 14:23:54.639005 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0946f8e-100f-4ceb-9766-2254ec001229-kube-api-access-ffjhm" (OuterVolumeSpecName: "kube-api-access-ffjhm") pod "c0946f8e-100f-4ceb-9766-2254ec001229" (UID: "c0946f8e-100f-4ceb-9766-2254ec001229"). InnerVolumeSpecName "kube-api-access-ffjhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:23:54 crc kubenswrapper[4740]: I0105 14:23:54.682390 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0946f8e-100f-4ceb-9766-2254ec001229-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "c0946f8e-100f-4ceb-9766-2254ec001229" (UID: "c0946f8e-100f-4ceb-9766-2254ec001229"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:23:54 crc kubenswrapper[4740]: I0105 14:23:54.686370 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0946f8e-100f-4ceb-9766-2254ec001229-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c0946f8e-100f-4ceb-9766-2254ec001229" (UID: "c0946f8e-100f-4ceb-9766-2254ec001229"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:23:54 crc kubenswrapper[4740]: I0105 14:23:54.736124 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffjhm\" (UniqueName: \"kubernetes.io/projected/c0946f8e-100f-4ceb-9766-2254ec001229-kube-api-access-ffjhm\") on node \"crc\" DevicePath \"\"" Jan 05 14:23:54 crc kubenswrapper[4740]: I0105 14:23:54.736152 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0946f8e-100f-4ceb-9766-2254ec001229-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 05 14:23:54 crc kubenswrapper[4740]: I0105 14:23:54.736162 4740 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c0946f8e-100f-4ceb-9766-2254ec001229-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:23:54 crc kubenswrapper[4740]: I0105 14:23:54.945016 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" event={"ID":"c0946f8e-100f-4ceb-9766-2254ec001229","Type":"ContainerDied","Data":"aa7a701997fcfbf8d8954b3f47a4694578d660a85ce49dd3879a1bef14f9a36e"} Jan 05 14:23:54 crc kubenswrapper[4740]: I0105 14:23:54.945886 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa7a701997fcfbf8d8954b3f47a4694578d660a85ce49dd3879a1bef14f9a36e" Jan 05 14:23:54 crc kubenswrapper[4740]: I0105 14:23:54.945080 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-lwc8v" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.033545 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm"] Jan 05 14:23:55 crc kubenswrapper[4740]: E0105 14:23:55.034188 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0946f8e-100f-4ceb-9766-2254ec001229" containerName="ssh-known-hosts-edpm-deployment" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.034210 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0946f8e-100f-4ceb-9766-2254ec001229" containerName="ssh-known-hosts-edpm-deployment" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.034542 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0946f8e-100f-4ceb-9766-2254ec001229" containerName="ssh-known-hosts-edpm-deployment" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.035566 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.042951 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.043217 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.043322 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.045901 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.054140 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm"] Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.145765 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjlnj\" (UniqueName: \"kubernetes.io/projected/616aed46-7a36-4531-afca-003df9bfb20d-kube-api-access-vjlnj\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4xzrm\" (UID: \"616aed46-7a36-4531-afca-003df9bfb20d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.147645 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/616aed46-7a36-4531-afca-003df9bfb20d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4xzrm\" (UID: \"616aed46-7a36-4531-afca-003df9bfb20d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.147813 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/616aed46-7a36-4531-afca-003df9bfb20d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4xzrm\" (UID: \"616aed46-7a36-4531-afca-003df9bfb20d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.249969 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/616aed46-7a36-4531-afca-003df9bfb20d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4xzrm\" (UID: \"616aed46-7a36-4531-afca-003df9bfb20d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.251527 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjlnj\" (UniqueName: \"kubernetes.io/projected/616aed46-7a36-4531-afca-003df9bfb20d-kube-api-access-vjlnj\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4xzrm\" (UID: \"616aed46-7a36-4531-afca-003df9bfb20d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.251682 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/616aed46-7a36-4531-afca-003df9bfb20d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4xzrm\" (UID: \"616aed46-7a36-4531-afca-003df9bfb20d\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.255194 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/616aed46-7a36-4531-afca-003df9bfb20d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4xzrm\" (UID: \"616aed46-7a36-4531-afca-003df9bfb20d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.257559 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/616aed46-7a36-4531-afca-003df9bfb20d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4xzrm\" (UID: \"616aed46-7a36-4531-afca-003df9bfb20d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.277903 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjlnj\" (UniqueName: \"kubernetes.io/projected/616aed46-7a36-4531-afca-003df9bfb20d-kube-api-access-vjlnj\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-4xzrm\" (UID: \"616aed46-7a36-4531-afca-003df9bfb20d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.360889 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" Jan 05 14:23:55 crc kubenswrapper[4740]: W0105 14:23:55.937612 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod616aed46_7a36_4531_afca_003df9bfb20d.slice/crio-d4d577fc2cbb493f20aa058ff23c726f6a285b0bc12506b14aa580fa8aa03df8 WatchSource:0}: Error finding container d4d577fc2cbb493f20aa058ff23c726f6a285b0bc12506b14aa580fa8aa03df8: Status 404 returned error can't find the container with id d4d577fc2cbb493f20aa058ff23c726f6a285b0bc12506b14aa580fa8aa03df8 Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.950902 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm"] Jan 05 14:23:55 crc kubenswrapper[4740]: I0105 14:23:55.961666 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" event={"ID":"616aed46-7a36-4531-afca-003df9bfb20d","Type":"ContainerStarted","Data":"d4d577fc2cbb493f20aa058ff23c726f6a285b0bc12506b14aa580fa8aa03df8"} Jan 05 14:23:57 crc kubenswrapper[4740]: I0105 14:23:57.981335 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" event={"ID":"616aed46-7a36-4531-afca-003df9bfb20d","Type":"ContainerStarted","Data":"29bc6d31b9b1fa70565be179f54b65768d31be54e448729a36ba14e8785a8830"} Jan 05 14:23:57 crc kubenswrapper[4740]: I0105 14:23:57.997426 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" podStartSLOduration=2.267170714 podStartE2EDuration="2.997410231s" podCreationTimestamp="2026-01-05 14:23:55 +0000 UTC" firstStartedPulling="2026-01-05 14:23:55.941467616 +0000 UTC m=+2085.248376205" lastFinishedPulling="2026-01-05 14:23:56.671707113 +0000 UTC m=+2085.978615722" observedRunningTime="2026-01-05 14:23:57.993239517 +0000 UTC m=+2087.300148096" watchObservedRunningTime="2026-01-05 14:23:57.997410231 +0000 UTC 
m=+2087.304318810" Jan 05 14:24:06 crc kubenswrapper[4740]: I0105 14:24:06.074926 4740 generic.go:334] "Generic (PLEG): container finished" podID="616aed46-7a36-4531-afca-003df9bfb20d" containerID="29bc6d31b9b1fa70565be179f54b65768d31be54e448729a36ba14e8785a8830" exitCode=0 Jan 05 14:24:06 crc kubenswrapper[4740]: I0105 14:24:06.075158 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" event={"ID":"616aed46-7a36-4531-afca-003df9bfb20d","Type":"ContainerDied","Data":"29bc6d31b9b1fa70565be179f54b65768d31be54e448729a36ba14e8785a8830"} Jan 05 14:24:07 crc kubenswrapper[4740]: I0105 14:24:07.631338 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" Jan 05 14:24:07 crc kubenswrapper[4740]: I0105 14:24:07.797510 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjlnj\" (UniqueName: \"kubernetes.io/projected/616aed46-7a36-4531-afca-003df9bfb20d-kube-api-access-vjlnj\") pod \"616aed46-7a36-4531-afca-003df9bfb20d\" (UID: \"616aed46-7a36-4531-afca-003df9bfb20d\") " Jan 05 14:24:07 crc kubenswrapper[4740]: I0105 14:24:07.798204 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/616aed46-7a36-4531-afca-003df9bfb20d-ssh-key\") pod \"616aed46-7a36-4531-afca-003df9bfb20d\" (UID: \"616aed46-7a36-4531-afca-003df9bfb20d\") " Jan 05 14:24:07 crc kubenswrapper[4740]: I0105 14:24:07.798349 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/616aed46-7a36-4531-afca-003df9bfb20d-inventory\") pod \"616aed46-7a36-4531-afca-003df9bfb20d\" (UID: \"616aed46-7a36-4531-afca-003df9bfb20d\") " Jan 05 14:24:07 crc kubenswrapper[4740]: I0105 14:24:07.804710 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/616aed46-7a36-4531-afca-003df9bfb20d-kube-api-access-vjlnj" (OuterVolumeSpecName: "kube-api-access-vjlnj") pod "616aed46-7a36-4531-afca-003df9bfb20d" (UID: "616aed46-7a36-4531-afca-003df9bfb20d"). InnerVolumeSpecName "kube-api-access-vjlnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:24:07 crc kubenswrapper[4740]: I0105 14:24:07.831213 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/616aed46-7a36-4531-afca-003df9bfb20d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "616aed46-7a36-4531-afca-003df9bfb20d" (UID: "616aed46-7a36-4531-afca-003df9bfb20d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:24:07 crc kubenswrapper[4740]: I0105 14:24:07.860408 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/616aed46-7a36-4531-afca-003df9bfb20d-inventory" (OuterVolumeSpecName: "inventory") pod "616aed46-7a36-4531-afca-003df9bfb20d" (UID: "616aed46-7a36-4531-afca-003df9bfb20d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:24:07 crc kubenswrapper[4740]: I0105 14:24:07.901827 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjlnj\" (UniqueName: \"kubernetes.io/projected/616aed46-7a36-4531-afca-003df9bfb20d-kube-api-access-vjlnj\") on node \"crc\" DevicePath \"\"" Jan 05 14:24:07 crc kubenswrapper[4740]: I0105 14:24:07.901885 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/616aed46-7a36-4531-afca-003df9bfb20d-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:24:07 crc kubenswrapper[4740]: I0105 14:24:07.901897 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/616aed46-7a36-4531-afca-003df9bfb20d-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.096014 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" event={"ID":"616aed46-7a36-4531-afca-003df9bfb20d","Type":"ContainerDied","Data":"d4d577fc2cbb493f20aa058ff23c726f6a285b0bc12506b14aa580fa8aa03df8"} Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.096300 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4d577fc2cbb493f20aa058ff23c726f6a285b0bc12506b14aa580fa8aa03df8" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.096037 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-4xzrm" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.177501 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw"] Jan 05 14:24:08 crc kubenswrapper[4740]: E0105 14:24:08.178142 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="616aed46-7a36-4531-afca-003df9bfb20d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.178168 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="616aed46-7a36-4531-afca-003df9bfb20d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.178488 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="616aed46-7a36-4531-afca-003df9bfb20d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.179620 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.182513 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.182520 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.182549 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.187093 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.189465 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw"] Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.312770 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14251f6a-60c2-4493-9899-d61cf7f1a907-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw\" (UID: \"14251f6a-60c2-4493-9899-d61cf7f1a907\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.312843 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14251f6a-60c2-4493-9899-d61cf7f1a907-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw\" (UID: \"14251f6a-60c2-4493-9899-d61cf7f1a907\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.312937 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2khg9\" (UniqueName: \"kubernetes.io/projected/14251f6a-60c2-4493-9899-d61cf7f1a907-kube-api-access-2khg9\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw\" (UID: \"14251f6a-60c2-4493-9899-d61cf7f1a907\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.415540 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14251f6a-60c2-4493-9899-d61cf7f1a907-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw\" (UID: \"14251f6a-60c2-4493-9899-d61cf7f1a907\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.415869 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2khg9\" (UniqueName: \"kubernetes.io/projected/14251f6a-60c2-4493-9899-d61cf7f1a907-kube-api-access-2khg9\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw\" (UID: \"14251f6a-60c2-4493-9899-d61cf7f1a907\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.416143 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14251f6a-60c2-4493-9899-d61cf7f1a907-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw\" (UID: 
\"14251f6a-60c2-4493-9899-d61cf7f1a907\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.420897 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14251f6a-60c2-4493-9899-d61cf7f1a907-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw\" (UID: \"14251f6a-60c2-4493-9899-d61cf7f1a907\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.421217 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14251f6a-60c2-4493-9899-d61cf7f1a907-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw\" (UID: \"14251f6a-60c2-4493-9899-d61cf7f1a907\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.446667 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2khg9\" (UniqueName: \"kubernetes.io/projected/14251f6a-60c2-4493-9899-d61cf7f1a907-kube-api-access-2khg9\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw\" (UID: \"14251f6a-60c2-4493-9899-d61cf7f1a907\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" Jan 05 14:24:08 crc kubenswrapper[4740]: I0105 14:24:08.500111 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" Jan 05 14:24:09 crc kubenswrapper[4740]: I0105 14:24:09.150820 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw"] Jan 05 14:24:10 crc kubenswrapper[4740]: I0105 14:24:10.150200 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" event={"ID":"14251f6a-60c2-4493-9899-d61cf7f1a907","Type":"ContainerStarted","Data":"4d2d3401be8a57e352902dd050c423e4391decf7be9c00b2ad2ba8036d05b307"} Jan 05 14:24:11 crc kubenswrapper[4740]: I0105 14:24:11.163089 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" event={"ID":"14251f6a-60c2-4493-9899-d61cf7f1a907","Type":"ContainerStarted","Data":"b66c501efaccaa0e4ff6d35f90cbc09c0b2fcfb4cfe51137ce7b2b8ddb2dd075"} Jan 05 14:24:11 crc kubenswrapper[4740]: I0105 14:24:11.210724 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" podStartSLOduration=2.48552052 podStartE2EDuration="3.21070426s" podCreationTimestamp="2026-01-05 14:24:08 +0000 UTC" firstStartedPulling="2026-01-05 14:24:09.128611612 +0000 UTC m=+2098.435520191" lastFinishedPulling="2026-01-05 14:24:09.853795352 +0000 UTC m=+2099.160703931" observedRunningTime="2026-01-05 14:24:11.199820664 +0000 UTC m=+2100.506729353" watchObservedRunningTime="2026-01-05 14:24:11.21070426 +0000 UTC m=+2100.517612839" Jan 05 14:24:11 crc kubenswrapper[4740]: I0105 14:24:11.857542 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gggkm"] Jan 05 14:24:11 crc kubenswrapper[4740]: I0105 14:24:11.869984 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gggkm" Jan 05 14:24:11 crc kubenswrapper[4740]: I0105 14:24:11.903710 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gggkm"] Jan 05 14:24:12 crc kubenswrapper[4740]: I0105 14:24:12.016774 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-catalog-content\") pod \"community-operators-gggkm\" (UID: \"fa9ede8c-ee30-4cb9-af42-393b6dfdef79\") " pod="openshift-marketplace/community-operators-gggkm" Jan 05 14:24:12 crc kubenswrapper[4740]: I0105 14:24:12.016864 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4pmg\" (UniqueName: \"kubernetes.io/projected/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-kube-api-access-k4pmg\") pod \"community-operators-gggkm\" (UID: \"fa9ede8c-ee30-4cb9-af42-393b6dfdef79\") " pod="openshift-marketplace/community-operators-gggkm" Jan 05 14:24:12 crc kubenswrapper[4740]: I0105 14:24:12.017009 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-utilities\") pod \"community-operators-gggkm\" (UID: \"fa9ede8c-ee30-4cb9-af42-393b6dfdef79\") " pod="openshift-marketplace/community-operators-gggkm" Jan 05 14:24:12 crc kubenswrapper[4740]: I0105 14:24:12.119473 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-catalog-content\") pod \"community-operators-gggkm\" (UID: \"fa9ede8c-ee30-4cb9-af42-393b6dfdef79\") " pod="openshift-marketplace/community-operators-gggkm" Jan 05 14:24:12 crc kubenswrapper[4740]: I0105 14:24:12.119554 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4pmg\" (UniqueName: \"kubernetes.io/projected/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-kube-api-access-k4pmg\") pod \"community-operators-gggkm\" (UID: \"fa9ede8c-ee30-4cb9-af42-393b6dfdef79\") " pod="openshift-marketplace/community-operators-gggkm" Jan 05 14:24:12 crc kubenswrapper[4740]: I0105 14:24:12.119610 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-utilities\") pod \"community-operators-gggkm\" (UID: \"fa9ede8c-ee30-4cb9-af42-393b6dfdef79\") " pod="openshift-marketplace/community-operators-gggkm" Jan 05 14:24:12 crc kubenswrapper[4740]: I0105 14:24:12.119983 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-catalog-content\") pod \"community-operators-gggkm\" (UID: \"fa9ede8c-ee30-4cb9-af42-393b6dfdef79\") " pod="openshift-marketplace/community-operators-gggkm" Jan 05 14:24:12 crc kubenswrapper[4740]: I0105 14:24:12.120347 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-utilities\") pod \"community-operators-gggkm\" (UID: \"fa9ede8c-ee30-4cb9-af42-393b6dfdef79\") " pod="openshift-marketplace/community-operators-gggkm" Jan 05 14:24:12 crc kubenswrapper[4740]: I0105 14:24:12.149972 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-k4pmg\" (UniqueName: \"kubernetes.io/projected/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-kube-api-access-k4pmg\") pod \"community-operators-gggkm\" (UID: \"fa9ede8c-ee30-4cb9-af42-393b6dfdef79\") " pod="openshift-marketplace/community-operators-gggkm" Jan 05 14:24:12 crc kubenswrapper[4740]: I0105 14:24:12.196001 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gggkm" Jan 05 14:24:12 crc kubenswrapper[4740]: I0105 14:24:12.721644 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gggkm"] Jan 05 14:24:12 crc kubenswrapper[4740]: W0105 14:24:12.731502 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa9ede8c_ee30_4cb9_af42_393b6dfdef79.slice/crio-1dc16eb390a203c054ea71d4e22bd24d72a5e6ed283a3168598c8a11464ad1b0 WatchSource:0}: Error finding container 1dc16eb390a203c054ea71d4e22bd24d72a5e6ed283a3168598c8a11464ad1b0: Status 404 returned error can't find the container with id 1dc16eb390a203c054ea71d4e22bd24d72a5e6ed283a3168598c8a11464ad1b0 Jan 05 14:24:13 crc kubenswrapper[4740]: I0105 14:24:13.195845 4740 generic.go:334] "Generic (PLEG): container finished" podID="fa9ede8c-ee30-4cb9-af42-393b6dfdef79" containerID="10f146edcd7fab8c44b8e360b0ad1e29031ef05cb5c7cfdb8dd08ab42ccc04b1" exitCode=0 Jan 05 14:24:13 crc kubenswrapper[4740]: I0105 14:24:13.195921 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gggkm" event={"ID":"fa9ede8c-ee30-4cb9-af42-393b6dfdef79","Type":"ContainerDied","Data":"10f146edcd7fab8c44b8e360b0ad1e29031ef05cb5c7cfdb8dd08ab42ccc04b1"} Jan 05 14:24:13 crc kubenswrapper[4740]: I0105 14:24:13.196176 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gggkm" event={"ID":"fa9ede8c-ee30-4cb9-af42-393b6dfdef79","Type":"ContainerStarted","Data":"1dc16eb390a203c054ea71d4e22bd24d72a5e6ed283a3168598c8a11464ad1b0"} Jan 05 14:24:14 crc kubenswrapper[4740]: I0105 14:24:14.208679 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gggkm" event={"ID":"fa9ede8c-ee30-4cb9-af42-393b6dfdef79","Type":"ContainerStarted","Data":"823d4a59f185aa91595098690848134df11868001973c2a9a6ee52ce47026d24"} Jan 05 14:24:17 crc kubenswrapper[4740]: I0105 14:24:17.243439 4740 generic.go:334] "Generic (PLEG): container finished" podID="fa9ede8c-ee30-4cb9-af42-393b6dfdef79" containerID="823d4a59f185aa91595098690848134df11868001973c2a9a6ee52ce47026d24" exitCode=0 Jan 05 14:24:17 crc kubenswrapper[4740]: I0105 14:24:17.243503 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gggkm" event={"ID":"fa9ede8c-ee30-4cb9-af42-393b6dfdef79","Type":"ContainerDied","Data":"823d4a59f185aa91595098690848134df11868001973c2a9a6ee52ce47026d24"} Jan 05 14:24:18 crc kubenswrapper[4740]: I0105 14:24:18.256835 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gggkm" event={"ID":"fa9ede8c-ee30-4cb9-af42-393b6dfdef79","Type":"ContainerStarted","Data":"ed1382f2ba3ebe7f759a46a57fb84874e770d8e86127c1cb4fa05241caa439a7"} Jan 05 14:24:18 crc kubenswrapper[4740]: I0105 14:24:18.281273 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gggkm" 
podStartSLOduration=2.72724515 podStartE2EDuration="7.281252263s" podCreationTimestamp="2026-01-05 14:24:11 +0000 UTC" firstStartedPulling="2026-01-05 14:24:13.197713071 +0000 UTC m=+2102.504621690" lastFinishedPulling="2026-01-05 14:24:17.751720224 +0000 UTC m=+2107.058628803" observedRunningTime="2026-01-05 14:24:18.274818928 +0000 UTC m=+2107.581727507" watchObservedRunningTime="2026-01-05 14:24:18.281252263 +0000 UTC m=+2107.588160842" Jan 05 14:24:21 crc kubenswrapper[4740]: I0105 14:24:21.289630 4740 generic.go:334] "Generic (PLEG): container finished" podID="14251f6a-60c2-4493-9899-d61cf7f1a907" containerID="b66c501efaccaa0e4ff6d35f90cbc09c0b2fcfb4cfe51137ce7b2b8ddb2dd075" exitCode=0 Jan 05 14:24:21 crc kubenswrapper[4740]: I0105 14:24:21.289768 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" event={"ID":"14251f6a-60c2-4493-9899-d61cf7f1a907","Type":"ContainerDied","Data":"b66c501efaccaa0e4ff6d35f90cbc09c0b2fcfb4cfe51137ce7b2b8ddb2dd075"} Jan 05 14:24:22 crc kubenswrapper[4740]: I0105 14:24:22.949191 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gggkm" Jan 05 14:24:22 crc kubenswrapper[4740]: I0105 14:24:22.949480 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gggkm" Jan 05 14:24:23 crc kubenswrapper[4740]: I0105 14:24:23.105244 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gggkm" Jan 05 14:24:23 crc kubenswrapper[4740]: I0105 14:24:23.584792 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" Jan 05 14:24:23 crc kubenswrapper[4740]: I0105 14:24:23.597681 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14251f6a-60c2-4493-9899-d61cf7f1a907-ssh-key\") pod \"14251f6a-60c2-4493-9899-d61cf7f1a907\" (UID: \"14251f6a-60c2-4493-9899-d61cf7f1a907\") " Jan 05 14:24:23 crc kubenswrapper[4740]: I0105 14:24:23.645361 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14251f6a-60c2-4493-9899-d61cf7f1a907-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "14251f6a-60c2-4493-9899-d61cf7f1a907" (UID: "14251f6a-60c2-4493-9899-d61cf7f1a907"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:24:23 crc kubenswrapper[4740]: I0105 14:24:23.699404 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2khg9\" (UniqueName: \"kubernetes.io/projected/14251f6a-60c2-4493-9899-d61cf7f1a907-kube-api-access-2khg9\") pod \"14251f6a-60c2-4493-9899-d61cf7f1a907\" (UID: \"14251f6a-60c2-4493-9899-d61cf7f1a907\") " Jan 05 14:24:23 crc kubenswrapper[4740]: I0105 14:24:23.699487 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14251f6a-60c2-4493-9899-d61cf7f1a907-inventory\") pod \"14251f6a-60c2-4493-9899-d61cf7f1a907\" (UID: \"14251f6a-60c2-4493-9899-d61cf7f1a907\") " Jan 05 14:24:23 crc kubenswrapper[4740]: I0105 14:24:23.699995 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14251f6a-60c2-4493-9899-d61cf7f1a907-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:24:23 crc kubenswrapper[4740]: I0105 14:24:23.706208 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14251f6a-60c2-4493-9899-d61cf7f1a907-kube-api-access-2khg9" (OuterVolumeSpecName: "kube-api-access-2khg9") pod "14251f6a-60c2-4493-9899-d61cf7f1a907" (UID: "14251f6a-60c2-4493-9899-d61cf7f1a907"). InnerVolumeSpecName "kube-api-access-2khg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:24:23 crc kubenswrapper[4740]: I0105 14:24:23.730543 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14251f6a-60c2-4493-9899-d61cf7f1a907-inventory" (OuterVolumeSpecName: "inventory") pod "14251f6a-60c2-4493-9899-d61cf7f1a907" (UID: "14251f6a-60c2-4493-9899-d61cf7f1a907"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:24:23 crc kubenswrapper[4740]: I0105 14:24:23.802883 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2khg9\" (UniqueName: \"kubernetes.io/projected/14251f6a-60c2-4493-9899-d61cf7f1a907-kube-api-access-2khg9\") on node \"crc\" DevicePath \"\"" Jan 05 14:24:23 crc kubenswrapper[4740]: I0105 14:24:23.802930 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14251f6a-60c2-4493-9899-d61cf7f1a907-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:24:23 crc kubenswrapper[4740]: I0105 14:24:23.975489 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" event={"ID":"14251f6a-60c2-4493-9899-d61cf7f1a907","Type":"ContainerDied","Data":"4d2d3401be8a57e352902dd050c423e4391decf7be9c00b2ad2ba8036d05b307"} Jan 05 14:24:23 crc kubenswrapper[4740]: I0105 14:24:23.975834 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d2d3401be8a57e352902dd050c423e4391decf7be9c00b2ad2ba8036d05b307" Jan 05 14:24:23 crc kubenswrapper[4740]: I0105 14:24:23.975536 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.042564 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gggkm" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.101655 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gggkm"] Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.197017 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9"] Jan 05 14:24:24 crc kubenswrapper[4740]: E0105 14:24:24.197548 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14251f6a-60c2-4493-9899-d61cf7f1a907" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.197575 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="14251f6a-60c2-4493-9899-d61cf7f1a907" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.197835 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="14251f6a-60c2-4493-9899-d61cf7f1a907" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.198762 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.200971 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.201347 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.203479 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.203543 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.203648 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.204056 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.204145 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.204485 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.208605 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.213478 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.213571 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.213603 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.213661 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.213710 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.213743 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.213773 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.213820 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.213849 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.213890 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.213906 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.213950 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.213969 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.214026 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gztgz\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-kube-api-access-gztgz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.214076 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.214126 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.227572 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9"] Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.315748 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gztgz\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-kube-api-access-gztgz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.315808 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.315880 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.315930 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.315962 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.315981 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: 
\"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.316034 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.316104 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.316134 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.316181 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.316221 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.316243 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.316276 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 
14:24:24.316299 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.316336 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.316359 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.324283 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.324573 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.325243 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.328809 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.329209 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.330363 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.334027 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.334135 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gztgz\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-kube-api-access-gztgz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.334503 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.335873 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.336509 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.336689 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.337074 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.337680 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.338539 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.338591 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:24 crc kubenswrapper[4740]: I0105 14:24:24.523666 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:24:25 crc kubenswrapper[4740]: I0105 14:24:25.272964 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9"] Jan 05 14:24:25 crc kubenswrapper[4740]: I0105 14:24:25.997992 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gggkm" podUID="fa9ede8c-ee30-4cb9-af42-393b6dfdef79" containerName="registry-server" containerID="cri-o://ed1382f2ba3ebe7f759a46a57fb84874e770d8e86127c1cb4fa05241caa439a7" gracePeriod=2 Jan 05 14:24:25 crc kubenswrapper[4740]: I0105 14:24:25.998300 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" event={"ID":"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb","Type":"ContainerStarted","Data":"6ddbcfbd52352b4a2e894a0a1bcff45d6957fd18cea0f9ef68b7b452ef51e4f6"} Jan 05 14:24:26 crc kubenswrapper[4740]: I0105 14:24:26.553129 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gggkm" Jan 05 14:24:26 crc kubenswrapper[4740]: I0105 14:24:26.679477 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4pmg\" (UniqueName: \"kubernetes.io/projected/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-kube-api-access-k4pmg\") pod \"fa9ede8c-ee30-4cb9-af42-393b6dfdef79\" (UID: \"fa9ede8c-ee30-4cb9-af42-393b6dfdef79\") " Jan 05 14:24:26 crc kubenswrapper[4740]: I0105 14:24:26.679749 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-utilities\") pod \"fa9ede8c-ee30-4cb9-af42-393b6dfdef79\" (UID: \"fa9ede8c-ee30-4cb9-af42-393b6dfdef79\") " Jan 05 14:24:26 crc kubenswrapper[4740]: I0105 14:24:26.679899 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-catalog-content\") pod \"fa9ede8c-ee30-4cb9-af42-393b6dfdef79\" (UID: \"fa9ede8c-ee30-4cb9-af42-393b6dfdef79\") " Jan 05 14:24:26 crc kubenswrapper[4740]: I0105 14:24:26.683781 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-utilities" (OuterVolumeSpecName: "utilities") pod "fa9ede8c-ee30-4cb9-af42-393b6dfdef79" (UID: "fa9ede8c-ee30-4cb9-af42-393b6dfdef79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:24:26 crc kubenswrapper[4740]: I0105 14:24:26.684658 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-kube-api-access-k4pmg" (OuterVolumeSpecName: "kube-api-access-k4pmg") pod "fa9ede8c-ee30-4cb9-af42-393b6dfdef79" (UID: "fa9ede8c-ee30-4cb9-af42-393b6dfdef79"). InnerVolumeSpecName "kube-api-access-k4pmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:24:26 crc kubenswrapper[4740]: I0105 14:24:26.753460 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa9ede8c-ee30-4cb9-af42-393b6dfdef79" (UID: "fa9ede8c-ee30-4cb9-af42-393b6dfdef79"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:24:26 crc kubenswrapper[4740]: I0105 14:24:26.782518 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:24:26 crc kubenswrapper[4740]: I0105 14:24:26.782552 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4pmg\" (UniqueName: \"kubernetes.io/projected/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-kube-api-access-k4pmg\") on node \"crc\" DevicePath \"\"" Jan 05 14:24:26 crc kubenswrapper[4740]: I0105 14:24:26.782579 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9ede8c-ee30-4cb9-af42-393b6dfdef79-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:24:27 crc kubenswrapper[4740]: I0105 14:24:27.010035 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" event={"ID":"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb","Type":"ContainerStarted","Data":"b97d1ee9960532035743e8f7456ebdebf99d60a547f231a6f7431d317fa7ddf6"} Jan 05 14:24:27 crc kubenswrapper[4740]: I0105 14:24:27.012555 4740 generic.go:334] "Generic (PLEG): container finished" podID="fa9ede8c-ee30-4cb9-af42-393b6dfdef79" containerID="ed1382f2ba3ebe7f759a46a57fb84874e770d8e86127c1cb4fa05241caa439a7" exitCode=0 Jan 05 14:24:27 crc kubenswrapper[4740]: I0105 14:24:27.012596 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gggkm" event={"ID":"fa9ede8c-ee30-4cb9-af42-393b6dfdef79","Type":"ContainerDied","Data":"ed1382f2ba3ebe7f759a46a57fb84874e770d8e86127c1cb4fa05241caa439a7"} Jan 05 14:24:27 crc kubenswrapper[4740]: I0105 14:24:27.012639 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gggkm" event={"ID":"fa9ede8c-ee30-4cb9-af42-393b6dfdef79","Type":"ContainerDied","Data":"1dc16eb390a203c054ea71d4e22bd24d72a5e6ed283a3168598c8a11464ad1b0"} Jan 05 14:24:27 crc kubenswrapper[4740]: I0105 14:24:27.012657 4740 scope.go:117] "RemoveContainer" containerID="ed1382f2ba3ebe7f759a46a57fb84874e770d8e86127c1cb4fa05241caa439a7" Jan 05 14:24:27 crc kubenswrapper[4740]: I0105 14:24:27.012672 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gggkm" Jan 05 14:24:27 crc kubenswrapper[4740]: I0105 14:24:27.031672 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" podStartSLOduration=2.53866133 podStartE2EDuration="3.031652655s" podCreationTimestamp="2026-01-05 14:24:24 +0000 UTC" firstStartedPulling="2026-01-05 14:24:25.28239943 +0000 UTC m=+2114.589308009" lastFinishedPulling="2026-01-05 14:24:25.775390755 +0000 UTC m=+2115.082299334" observedRunningTime="2026-01-05 14:24:27.031136201 +0000 UTC m=+2116.338044790" watchObservedRunningTime="2026-01-05 14:24:27.031652655 +0000 UTC m=+2116.338561234" Jan 05 14:24:27 crc kubenswrapper[4740]: I0105 14:24:27.058340 4740 scope.go:117] "RemoveContainer" containerID="823d4a59f185aa91595098690848134df11868001973c2a9a6ee52ce47026d24" Jan 05 14:24:27 crc kubenswrapper[4740]: I0105 14:24:27.070074 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gggkm"] Jan 05 14:24:27 crc kubenswrapper[4740]: I0105 14:24:27.079742 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gggkm"] Jan 05 14:24:27 crc kubenswrapper[4740]: I0105 14:24:27.096859 4740 scope.go:117] "RemoveContainer" containerID="10f146edcd7fab8c44b8e360b0ad1e29031ef05cb5c7cfdb8dd08ab42ccc04b1" Jan 05 14:24:27 crc kubenswrapper[4740]: I0105 14:24:27.158287 4740 scope.go:117] "RemoveContainer" containerID="ed1382f2ba3ebe7f759a46a57fb84874e770d8e86127c1cb4fa05241caa439a7" Jan 05 14:24:27 crc kubenswrapper[4740]: E0105 14:24:27.158768 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed1382f2ba3ebe7f759a46a57fb84874e770d8e86127c1cb4fa05241caa439a7\": container with ID starting with ed1382f2ba3ebe7f759a46a57fb84874e770d8e86127c1cb4fa05241caa439a7 not found: ID does not exist" containerID="ed1382f2ba3ebe7f759a46a57fb84874e770d8e86127c1cb4fa05241caa439a7" Jan 05 14:24:27 crc kubenswrapper[4740]: I0105 14:24:27.158824 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1382f2ba3ebe7f759a46a57fb84874e770d8e86127c1cb4fa05241caa439a7"} err="failed to get container status \"ed1382f2ba3ebe7f759a46a57fb84874e770d8e86127c1cb4fa05241caa439a7\": rpc error: code = NotFound desc = could not find container \"ed1382f2ba3ebe7f759a46a57fb84874e770d8e86127c1cb4fa05241caa439a7\": container with ID starting with ed1382f2ba3ebe7f759a46a57fb84874e770d8e86127c1cb4fa05241caa439a7 not found: ID does not exist" Jan 05 14:24:27 crc kubenswrapper[4740]: I0105 14:24:27.158856 4740 scope.go:117] "RemoveContainer" containerID="823d4a59f185aa91595098690848134df11868001973c2a9a6ee52ce47026d24" Jan 05 14:24:27 crc kubenswrapper[4740]: E0105 14:24:27.159311 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"823d4a59f185aa91595098690848134df11868001973c2a9a6ee52ce47026d24\": container with ID starting with 823d4a59f185aa91595098690848134df11868001973c2a9a6ee52ce47026d24 not found: ID does not exist" containerID="823d4a59f185aa91595098690848134df11868001973c2a9a6ee52ce47026d24" Jan 05 14:24:27 crc kubenswrapper[4740]: I0105 14:24:27.159349 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"823d4a59f185aa91595098690848134df11868001973c2a9a6ee52ce47026d24"} 
err="failed to get container status \"823d4a59f185aa91595098690848134df11868001973c2a9a6ee52ce47026d24\": rpc error: code = NotFound desc = could not find container \"823d4a59f185aa91595098690848134df11868001973c2a9a6ee52ce47026d24\": container with ID starting with 823d4a59f185aa91595098690848134df11868001973c2a9a6ee52ce47026d24 not found: ID does not exist" Jan 05 14:24:27 crc kubenswrapper[4740]: I0105 14:24:27.159374 4740 scope.go:117] "RemoveContainer" containerID="10f146edcd7fab8c44b8e360b0ad1e29031ef05cb5c7cfdb8dd08ab42ccc04b1" Jan 05 14:24:27 crc kubenswrapper[4740]: E0105 14:24:27.159744 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10f146edcd7fab8c44b8e360b0ad1e29031ef05cb5c7cfdb8dd08ab42ccc04b1\": container with ID starting with 10f146edcd7fab8c44b8e360b0ad1e29031ef05cb5c7cfdb8dd08ab42ccc04b1 not found: ID does not exist" containerID="10f146edcd7fab8c44b8e360b0ad1e29031ef05cb5c7cfdb8dd08ab42ccc04b1" Jan 05 14:24:27 crc kubenswrapper[4740]: I0105 14:24:27.159771 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10f146edcd7fab8c44b8e360b0ad1e29031ef05cb5c7cfdb8dd08ab42ccc04b1"} err="failed to get container status \"10f146edcd7fab8c44b8e360b0ad1e29031ef05cb5c7cfdb8dd08ab42ccc04b1\": rpc error: code = NotFound desc = could not find container \"10f146edcd7fab8c44b8e360b0ad1e29031ef05cb5c7cfdb8dd08ab42ccc04b1\": container with ID starting with 10f146edcd7fab8c44b8e360b0ad1e29031ef05cb5c7cfdb8dd08ab42ccc04b1 not found: ID does not exist" Jan 05 14:24:28 crc kubenswrapper[4740]: I0105 14:24:28.984512 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa9ede8c-ee30-4cb9-af42-393b6dfdef79" path="/var/lib/kubelet/pods/fa9ede8c-ee30-4cb9-af42-393b6dfdef79/volumes" Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.219259 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nbzls"] Jan 05 14:24:32 crc kubenswrapper[4740]: E0105 14:24:32.220470 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9ede8c-ee30-4cb9-af42-393b6dfdef79" containerName="registry-server" Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.220488 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9ede8c-ee30-4cb9-af42-393b6dfdef79" containerName="registry-server" Jan 05 14:24:32 crc kubenswrapper[4740]: E0105 14:24:32.220509 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9ede8c-ee30-4cb9-af42-393b6dfdef79" containerName="extract-content" Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.220521 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9ede8c-ee30-4cb9-af42-393b6dfdef79" containerName="extract-content" Jan 05 14:24:32 crc kubenswrapper[4740]: E0105 14:24:32.220548 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9ede8c-ee30-4cb9-af42-393b6dfdef79" containerName="extract-utilities" Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.220562 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9ede8c-ee30-4cb9-af42-393b6dfdef79" containerName="extract-utilities" Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.220872 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa9ede8c-ee30-4cb9-af42-393b6dfdef79" containerName="registry-server" Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.223133 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nbzls" Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.234893 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nbzls"] Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.338147 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9150492e-80b6-40cc-b84f-4631111d863d-utilities\") pod \"redhat-marketplace-nbzls\" (UID: \"9150492e-80b6-40cc-b84f-4631111d863d\") " pod="openshift-marketplace/redhat-marketplace-nbzls" Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.338299 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9150492e-80b6-40cc-b84f-4631111d863d-catalog-content\") pod \"redhat-marketplace-nbzls\" (UID: \"9150492e-80b6-40cc-b84f-4631111d863d\") " pod="openshift-marketplace/redhat-marketplace-nbzls" Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.338413 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9nwr\" (UniqueName: \"kubernetes.io/projected/9150492e-80b6-40cc-b84f-4631111d863d-kube-api-access-w9nwr\") pod \"redhat-marketplace-nbzls\" (UID: \"9150492e-80b6-40cc-b84f-4631111d863d\") " pod="openshift-marketplace/redhat-marketplace-nbzls" Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.440969 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9150492e-80b6-40cc-b84f-4631111d863d-catalog-content\") pod \"redhat-marketplace-nbzls\" (UID: \"9150492e-80b6-40cc-b84f-4631111d863d\") " pod="openshift-marketplace/redhat-marketplace-nbzls" Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.441104 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9nwr\" (UniqueName: \"kubernetes.io/projected/9150492e-80b6-40cc-b84f-4631111d863d-kube-api-access-w9nwr\") pod \"redhat-marketplace-nbzls\" (UID: \"9150492e-80b6-40cc-b84f-4631111d863d\") " pod="openshift-marketplace/redhat-marketplace-nbzls" Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.441191 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9150492e-80b6-40cc-b84f-4631111d863d-utilities\") pod \"redhat-marketplace-nbzls\" (UID: \"9150492e-80b6-40cc-b84f-4631111d863d\") " pod="openshift-marketplace/redhat-marketplace-nbzls" Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.441665 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9150492e-80b6-40cc-b84f-4631111d863d-utilities\") pod \"redhat-marketplace-nbzls\" (UID: \"9150492e-80b6-40cc-b84f-4631111d863d\") " pod="openshift-marketplace/redhat-marketplace-nbzls" Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.441710 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9150492e-80b6-40cc-b84f-4631111d863d-catalog-content\") pod \"redhat-marketplace-nbzls\" (UID: \"9150492e-80b6-40cc-b84f-4631111d863d\") " pod="openshift-marketplace/redhat-marketplace-nbzls" Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.466374 4740 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-w9nwr\" (UniqueName: \"kubernetes.io/projected/9150492e-80b6-40cc-b84f-4631111d863d-kube-api-access-w9nwr\") pod \"redhat-marketplace-nbzls\" (UID: \"9150492e-80b6-40cc-b84f-4631111d863d\") " pod="openshift-marketplace/redhat-marketplace-nbzls" Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.555324 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nbzls" Jan 05 14:24:32 crc kubenswrapper[4740]: I0105 14:24:32.714784 4740 scope.go:117] "RemoveContainer" containerID="43116dd67a5b84441d88474b594ce4b48a671c0a95f4650be9f6aab637a5b688" Jan 05 14:24:33 crc kubenswrapper[4740]: I0105 14:24:33.045895 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-whbn9"] Jan 05 14:24:33 crc kubenswrapper[4740]: I0105 14:24:33.056902 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-whbn9"] Jan 05 14:24:33 crc kubenswrapper[4740]: I0105 14:24:33.079717 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nbzls"] Jan 05 14:24:33 crc kubenswrapper[4740]: W0105 14:24:33.084162 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9150492e_80b6_40cc_b84f_4631111d863d.slice/crio-529c451324edd8d25d38c54646e90ef1466cd0f9874bb5f2fde0117550fd01a0 WatchSource:0}: Error finding container 529c451324edd8d25d38c54646e90ef1466cd0f9874bb5f2fde0117550fd01a0: Status 404 returned error can't find the container with id 529c451324edd8d25d38c54646e90ef1466cd0f9874bb5f2fde0117550fd01a0 Jan 05 14:24:34 crc kubenswrapper[4740]: I0105 14:24:34.103326 4740 generic.go:334] "Generic (PLEG): container finished" podID="9150492e-80b6-40cc-b84f-4631111d863d" containerID="72ed11824bff79658755daf4601893dde25ecdcdb0f020bb826ab73da1b6ede3" exitCode=0 Jan 05 14:24:34 crc kubenswrapper[4740]: I0105 14:24:34.103405 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbzls" event={"ID":"9150492e-80b6-40cc-b84f-4631111d863d","Type":"ContainerDied","Data":"72ed11824bff79658755daf4601893dde25ecdcdb0f020bb826ab73da1b6ede3"} Jan 05 14:24:34 crc kubenswrapper[4740]: I0105 14:24:34.103543 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbzls" event={"ID":"9150492e-80b6-40cc-b84f-4631111d863d","Type":"ContainerStarted","Data":"529c451324edd8d25d38c54646e90ef1466cd0f9874bb5f2fde0117550fd01a0"} Jan 05 14:24:34 crc kubenswrapper[4740]: I0105 14:24:34.986032 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="728565b1-c651-4b9e-a279-75ecf0e4eeb2" path="/var/lib/kubelet/pods/728565b1-c651-4b9e-a279-75ecf0e4eeb2/volumes" Jan 05 14:24:35 crc kubenswrapper[4740]: I0105 14:24:35.119118 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbzls" event={"ID":"9150492e-80b6-40cc-b84f-4631111d863d","Type":"ContainerStarted","Data":"d183430068064bf4d98eb2aff131c117935469183dc610b47791a741a04c20af"} Jan 05 14:24:36 crc kubenswrapper[4740]: I0105 14:24:36.131766 4740 generic.go:334] "Generic (PLEG): container finished" podID="9150492e-80b6-40cc-b84f-4631111d863d" containerID="d183430068064bf4d98eb2aff131c117935469183dc610b47791a741a04c20af" exitCode=0 Jan 05 14:24:36 crc kubenswrapper[4740]: I0105 14:24:36.131821 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-nbzls" event={"ID":"9150492e-80b6-40cc-b84f-4631111d863d","Type":"ContainerDied","Data":"d183430068064bf4d98eb2aff131c117935469183dc610b47791a741a04c20af"} Jan 05 14:24:38 crc kubenswrapper[4740]: I0105 14:24:38.210582 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbzls" event={"ID":"9150492e-80b6-40cc-b84f-4631111d863d","Type":"ContainerStarted","Data":"5b977ae7f27a3689c3ee1c62ed98c44f1abfbb53794e073cc9b7aaf28d83718d"} Jan 05 14:24:38 crc kubenswrapper[4740]: I0105 14:24:38.260057 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nbzls" podStartSLOduration=3.445261469 podStartE2EDuration="6.259950097s" podCreationTimestamp="2026-01-05 14:24:32 +0000 UTC" firstStartedPulling="2026-01-05 14:24:34.105518309 +0000 UTC m=+2123.412426888" lastFinishedPulling="2026-01-05 14:24:36.920206937 +0000 UTC m=+2126.227115516" observedRunningTime="2026-01-05 14:24:38.232496601 +0000 UTC m=+2127.539405180" watchObservedRunningTime="2026-01-05 14:24:38.259950097 +0000 UTC m=+2127.566858676" Jan 05 14:24:42 crc kubenswrapper[4740]: I0105 14:24:42.556327 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nbzls" Jan 05 14:24:42 crc kubenswrapper[4740]: I0105 14:24:42.557005 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nbzls" Jan 05 14:24:42 crc kubenswrapper[4740]: I0105 14:24:42.628120 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nbzls" Jan 05 14:24:43 crc kubenswrapper[4740]: I0105 14:24:43.330957 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nbzls" Jan 05 14:24:43 crc kubenswrapper[4740]: I0105 14:24:43.386403 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nbzls"] Jan 05 14:24:45 crc kubenswrapper[4740]: I0105 14:24:45.296885 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nbzls" podUID="9150492e-80b6-40cc-b84f-4631111d863d" containerName="registry-server" containerID="cri-o://5b977ae7f27a3689c3ee1c62ed98c44f1abfbb53794e073cc9b7aaf28d83718d" gracePeriod=2 Jan 05 14:24:45 crc kubenswrapper[4740]: I0105 14:24:45.869784 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nbzls" Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.023135 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9nwr\" (UniqueName: \"kubernetes.io/projected/9150492e-80b6-40cc-b84f-4631111d863d-kube-api-access-w9nwr\") pod \"9150492e-80b6-40cc-b84f-4631111d863d\" (UID: \"9150492e-80b6-40cc-b84f-4631111d863d\") " Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.023236 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9150492e-80b6-40cc-b84f-4631111d863d-utilities\") pod \"9150492e-80b6-40cc-b84f-4631111d863d\" (UID: \"9150492e-80b6-40cc-b84f-4631111d863d\") " Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.023383 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9150492e-80b6-40cc-b84f-4631111d863d-catalog-content\") pod \"9150492e-80b6-40cc-b84f-4631111d863d\" (UID: \"9150492e-80b6-40cc-b84f-4631111d863d\") " Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.024172 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9150492e-80b6-40cc-b84f-4631111d863d-utilities" (OuterVolumeSpecName: "utilities") pod "9150492e-80b6-40cc-b84f-4631111d863d" (UID: "9150492e-80b6-40cc-b84f-4631111d863d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.044905 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9150492e-80b6-40cc-b84f-4631111d863d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9150492e-80b6-40cc-b84f-4631111d863d" (UID: "9150492e-80b6-40cc-b84f-4631111d863d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.048699 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9150492e-80b6-40cc-b84f-4631111d863d-kube-api-access-w9nwr" (OuterVolumeSpecName: "kube-api-access-w9nwr") pod "9150492e-80b6-40cc-b84f-4631111d863d" (UID: "9150492e-80b6-40cc-b84f-4631111d863d"). InnerVolumeSpecName "kube-api-access-w9nwr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.126512 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9nwr\" (UniqueName: \"kubernetes.io/projected/9150492e-80b6-40cc-b84f-4631111d863d-kube-api-access-w9nwr\") on node \"crc\" DevicePath \"\"" Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.126787 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9150492e-80b6-40cc-b84f-4631111d863d-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.126879 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9150492e-80b6-40cc-b84f-4631111d863d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.307950 4740 generic.go:334] "Generic (PLEG): container finished" podID="9150492e-80b6-40cc-b84f-4631111d863d" containerID="5b977ae7f27a3689c3ee1c62ed98c44f1abfbb53794e073cc9b7aaf28d83718d" exitCode=0 Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.307991 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbzls" event={"ID":"9150492e-80b6-40cc-b84f-4631111d863d","Type":"ContainerDied","Data":"5b977ae7f27a3689c3ee1c62ed98c44f1abfbb53794e073cc9b7aaf28d83718d"} Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.308026 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nbzls" event={"ID":"9150492e-80b6-40cc-b84f-4631111d863d","Type":"ContainerDied","Data":"529c451324edd8d25d38c54646e90ef1466cd0f9874bb5f2fde0117550fd01a0"} Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.308043 4740 scope.go:117] "RemoveContainer" containerID="5b977ae7f27a3689c3ee1c62ed98c44f1abfbb53794e073cc9b7aaf28d83718d" Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.308041 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nbzls" Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.332857 4740 scope.go:117] "RemoveContainer" containerID="d183430068064bf4d98eb2aff131c117935469183dc610b47791a741a04c20af" Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.374282 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nbzls"] Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.389459 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nbzls"] Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.397570 4740 scope.go:117] "RemoveContainer" containerID="72ed11824bff79658755daf4601893dde25ecdcdb0f020bb826ab73da1b6ede3" Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.443904 4740 scope.go:117] "RemoveContainer" containerID="5b977ae7f27a3689c3ee1c62ed98c44f1abfbb53794e073cc9b7aaf28d83718d" Jan 05 14:24:46 crc kubenswrapper[4740]: E0105 14:24:46.444562 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b977ae7f27a3689c3ee1c62ed98c44f1abfbb53794e073cc9b7aaf28d83718d\": container with ID starting with 5b977ae7f27a3689c3ee1c62ed98c44f1abfbb53794e073cc9b7aaf28d83718d not found: ID does not exist" containerID="5b977ae7f27a3689c3ee1c62ed98c44f1abfbb53794e073cc9b7aaf28d83718d" Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.444611 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b977ae7f27a3689c3ee1c62ed98c44f1abfbb53794e073cc9b7aaf28d83718d"} err="failed to get container status \"5b977ae7f27a3689c3ee1c62ed98c44f1abfbb53794e073cc9b7aaf28d83718d\": rpc error: code = NotFound desc = could not find container \"5b977ae7f27a3689c3ee1c62ed98c44f1abfbb53794e073cc9b7aaf28d83718d\": container with ID starting with 5b977ae7f27a3689c3ee1c62ed98c44f1abfbb53794e073cc9b7aaf28d83718d not found: ID does not exist" Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.444643 4740 scope.go:117] "RemoveContainer" containerID="d183430068064bf4d98eb2aff131c117935469183dc610b47791a741a04c20af" Jan 05 14:24:46 crc kubenswrapper[4740]: E0105 14:24:46.445020 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d183430068064bf4d98eb2aff131c117935469183dc610b47791a741a04c20af\": container with ID starting with d183430068064bf4d98eb2aff131c117935469183dc610b47791a741a04c20af not found: ID does not exist" containerID="d183430068064bf4d98eb2aff131c117935469183dc610b47791a741a04c20af" Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.445089 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d183430068064bf4d98eb2aff131c117935469183dc610b47791a741a04c20af"} err="failed to get container status \"d183430068064bf4d98eb2aff131c117935469183dc610b47791a741a04c20af\": rpc error: code = NotFound desc = could not find container \"d183430068064bf4d98eb2aff131c117935469183dc610b47791a741a04c20af\": container with ID starting with d183430068064bf4d98eb2aff131c117935469183dc610b47791a741a04c20af not found: ID does not exist" Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.445144 4740 scope.go:117] "RemoveContainer" containerID="72ed11824bff79658755daf4601893dde25ecdcdb0f020bb826ab73da1b6ede3" Jan 05 14:24:46 crc kubenswrapper[4740]: E0105 14:24:46.447754 4740 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"72ed11824bff79658755daf4601893dde25ecdcdb0f020bb826ab73da1b6ede3\": container with ID starting with 72ed11824bff79658755daf4601893dde25ecdcdb0f020bb826ab73da1b6ede3 not found: ID does not exist" containerID="72ed11824bff79658755daf4601893dde25ecdcdb0f020bb826ab73da1b6ede3" Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.447850 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72ed11824bff79658755daf4601893dde25ecdcdb0f020bb826ab73da1b6ede3"} err="failed to get container status \"72ed11824bff79658755daf4601893dde25ecdcdb0f020bb826ab73da1b6ede3\": rpc error: code = NotFound desc = could not find container \"72ed11824bff79658755daf4601893dde25ecdcdb0f020bb826ab73da1b6ede3\": container with ID starting with 72ed11824bff79658755daf4601893dde25ecdcdb0f020bb826ab73da1b6ede3 not found: ID does not exist" Jan 05 14:24:46 crc kubenswrapper[4740]: I0105 14:24:46.983538 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9150492e-80b6-40cc-b84f-4631111d863d" path="/var/lib/kubelet/pods/9150492e-80b6-40cc-b84f-4631111d863d/volumes" Jan 05 14:25:01 crc kubenswrapper[4740]: I0105 14:25:01.915497 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:25:01 crc kubenswrapper[4740]: I0105 14:25:01.916391 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:25:19 crc kubenswrapper[4740]: I0105 14:25:19.795606 4740 generic.go:334] "Generic (PLEG): container finished" podID="e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" containerID="b97d1ee9960532035743e8f7456ebdebf99d60a547f231a6f7431d317fa7ddf6" exitCode=0 Jan 05 14:25:19 crc kubenswrapper[4740]: I0105 14:25:19.795654 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" event={"ID":"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb","Type":"ContainerDied","Data":"b97d1ee9960532035743e8f7456ebdebf99d60a547f231a6f7431d317fa7ddf6"} Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.372909 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.408520 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-ovn-default-certs-0\") pod \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.408629 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.408679 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-ssh-key\") pod \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.408765 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.408846 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-bootstrap-combined-ca-bundle\") pod \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.408905 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-ovn-combined-ca-bundle\") pod \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.408933 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-inventory\") pod \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.409008 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-libvirt-combined-ca-bundle\") pod \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.409035 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gztgz\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-kube-api-access-gztgz\") pod \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " Jan 05 14:25:21 crc kubenswrapper[4740]: 
I0105 14:25:21.409088 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-neutron-metadata-combined-ca-bundle\") pod \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.409143 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-nova-combined-ca-bundle\") pod \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.409178 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.409206 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-repo-setup-combined-ca-bundle\") pod \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.409287 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-telemetry-combined-ca-bundle\") pod \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.409352 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.409385 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-telemetry-power-monitoring-combined-ca-bundle\") pod \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\" (UID: \"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb\") " Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.416667 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" (UID: "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.418629 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" (UID: "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.419288 4740 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.419321 4740 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.420127 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" (UID: "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.423341 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" (UID: "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.439134 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" (UID: "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.439208 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" (UID: "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.439206 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" (UID: "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.439312 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" (UID: "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.439324 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" (UID: "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.439338 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" (UID: "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.439342 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-kube-api-access-gztgz" (OuterVolumeSpecName: "kube-api-access-gztgz") pod "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" (UID: "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb"). InnerVolumeSpecName "kube-api-access-gztgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.439379 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" (UID: "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.447381 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" (UID: "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.447710 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" (UID: "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.465226 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-inventory" (OuterVolumeSpecName: "inventory") pod "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" (UID: "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.481655 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" (UID: "e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.521649 4740 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.521685 4740 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.521699 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.521710 4740 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.521722 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gztgz\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-kube-api-access-gztgz\") on node \"crc\" DevicePath \"\"" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.521732 4740 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.521745 4740 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.521754 4740 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.521763 4740 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.521772 4740 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.521783 4740 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.521792 4740 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.521804 4740 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.521812 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.836546 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" event={"ID":"e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb","Type":"ContainerDied","Data":"6ddbcfbd52352b4a2e894a0a1bcff45d6957fd18cea0f9ef68b7b452ef51e4f6"} Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.836603 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ddbcfbd52352b4a2e894a0a1bcff45d6957fd18cea0f9ef68b7b452ef51e4f6" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.836685 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.975851 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j"] Jan 05 14:25:21 crc kubenswrapper[4740]: E0105 14:25:21.976720 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9150492e-80b6-40cc-b84f-4631111d863d" containerName="extract-content" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.976752 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="9150492e-80b6-40cc-b84f-4631111d863d" containerName="extract-content" Jan 05 14:25:21 crc kubenswrapper[4740]: E0105 14:25:21.976795 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9150492e-80b6-40cc-b84f-4631111d863d" containerName="extract-utilities" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.976809 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="9150492e-80b6-40cc-b84f-4631111d863d" containerName="extract-utilities" Jan 05 14:25:21 crc kubenswrapper[4740]: E0105 14:25:21.976850 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.976865 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 05 14:25:21 crc kubenswrapper[4740]: E0105 14:25:21.976927 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9150492e-80b6-40cc-b84f-4631111d863d" containerName="registry-server" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.976942 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="9150492e-80b6-40cc-b84f-4631111d863d" containerName="registry-server" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.977465 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="9150492e-80b6-40cc-b84f-4631111d863d" containerName="registry-server" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.977512 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.978970 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.982538 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.983416 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.983841 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.989555 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.990575 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j"] Jan 05 14:25:21 crc kubenswrapper[4740]: I0105 14:25:21.991994 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:25:22 crc kubenswrapper[4740]: I0105 14:25:22.035846 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4lx2\" (UniqueName: \"kubernetes.io/projected/237ed6e7-19dd-4f03-8da9-6c43db4798c6-kube-api-access-n4lx2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dl27j\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:25:22 crc kubenswrapper[4740]: I0105 14:25:22.035905 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dl27j\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:25:22 crc kubenswrapper[4740]: I0105 14:25:22.036055 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dl27j\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:25:22 crc kubenswrapper[4740]: I0105 14:25:22.036129 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dl27j\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:25:22 crc kubenswrapper[4740]: I0105 14:25:22.036254 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dl27j\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:25:22 crc kubenswrapper[4740]: I0105 14:25:22.139297 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dl27j\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:25:22 crc kubenswrapper[4740]: I0105 14:25:22.139366 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dl27j\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:25:22 crc kubenswrapper[4740]: I0105 14:25:22.139470 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dl27j\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:25:22 crc kubenswrapper[4740]: I0105 14:25:22.139564 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4lx2\" (UniqueName: \"kubernetes.io/projected/237ed6e7-19dd-4f03-8da9-6c43db4798c6-kube-api-access-n4lx2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dl27j\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:25:22 crc kubenswrapper[4740]: I0105 14:25:22.139603 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dl27j\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:25:22 crc kubenswrapper[4740]: I0105 14:25:22.142028 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dl27j\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:25:22 crc kubenswrapper[4740]: I0105 14:25:22.145680 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dl27j\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:25:22 crc kubenswrapper[4740]: I0105 14:25:22.148374 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dl27j\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:25:22 crc kubenswrapper[4740]: I0105 14:25:22.150540 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dl27j\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:25:22 crc kubenswrapper[4740]: I0105 14:25:22.158353 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4lx2\" (UniqueName: \"kubernetes.io/projected/237ed6e7-19dd-4f03-8da9-6c43db4798c6-kube-api-access-n4lx2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dl27j\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:25:22 crc kubenswrapper[4740]: I0105 14:25:22.306901 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:25:23 crc kubenswrapper[4740]: I0105 14:25:23.011563 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j"] Jan 05 14:25:23 crc kubenswrapper[4740]: W0105 14:25:23.025163 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod237ed6e7_19dd_4f03_8da9_6c43db4798c6.slice/crio-3affcd17bc32de4df5f38c381de12922fca74b53902194c68d3e17128854e395 WatchSource:0}: Error finding container 3affcd17bc32de4df5f38c381de12922fca74b53902194c68d3e17128854e395: Status 404 returned error can't find the container with id 3affcd17bc32de4df5f38c381de12922fca74b53902194c68d3e17128854e395 Jan 05 14:25:23 crc kubenswrapper[4740]: I0105 14:25:23.859865 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" event={"ID":"237ed6e7-19dd-4f03-8da9-6c43db4798c6","Type":"ContainerStarted","Data":"fdbb80735eefeb802f4015d21b213056c819174b21b558cf47658f9e5199e6aa"} Jan 05 14:25:23 crc kubenswrapper[4740]: I0105 14:25:23.860135 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" event={"ID":"237ed6e7-19dd-4f03-8da9-6c43db4798c6","Type":"ContainerStarted","Data":"3affcd17bc32de4df5f38c381de12922fca74b53902194c68d3e17128854e395"} Jan 05 14:25:23 crc kubenswrapper[4740]: I0105 14:25:23.880617 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" podStartSLOduration=2.363326916 podStartE2EDuration="2.880602463s" podCreationTimestamp="2026-01-05 14:25:21 +0000 UTC" firstStartedPulling="2026-01-05 14:25:23.028312397 +0000 UTC m=+2172.335220986" lastFinishedPulling="2026-01-05 14:25:23.545587954 +0000 UTC m=+2172.852496533" observedRunningTime="2026-01-05 14:25:23.878058634 +0000 UTC m=+2173.184967213" watchObservedRunningTime="2026-01-05 14:25:23.880602463 +0000 UTC m=+2173.187511042" Jan 05 14:25:31 crc kubenswrapper[4740]: I0105 14:25:31.916235 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:25:31 crc kubenswrapper[4740]: I0105 14:25:31.916833 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:25:32 crc kubenswrapper[4740]: I0105 14:25:32.875462 4740 scope.go:117] 
"RemoveContainer" containerID="6a50e3ee2c22fcb701f57ab938316fb8e8505f8933589d28b8f45b1c088d327e" Jan 05 14:26:01 crc kubenswrapper[4740]: I0105 14:26:01.916334 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:26:01 crc kubenswrapper[4740]: I0105 14:26:01.917205 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:26:01 crc kubenswrapper[4740]: I0105 14:26:01.917339 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 14:26:01 crc kubenswrapper[4740]: I0105 14:26:01.919618 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"87db5e19923a3a4961dbb7d28184d9af48401a591023a911c3f7453ebbf03439"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 14:26:01 crc kubenswrapper[4740]: I0105 14:26:01.919791 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://87db5e19923a3a4961dbb7d28184d9af48401a591023a911c3f7453ebbf03439" gracePeriod=600 Jan 05 14:26:02 crc kubenswrapper[4740]: I0105 14:26:02.397727 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="87db5e19923a3a4961dbb7d28184d9af48401a591023a911c3f7453ebbf03439" exitCode=0 Jan 05 14:26:02 crc kubenswrapper[4740]: I0105 14:26:02.397807 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"87db5e19923a3a4961dbb7d28184d9af48401a591023a911c3f7453ebbf03439"} Jan 05 14:26:02 crc kubenswrapper[4740]: I0105 14:26:02.398210 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b"} Jan 05 14:26:02 crc kubenswrapper[4740]: I0105 14:26:02.398247 4740 scope.go:117] "RemoveContainer" containerID="6b3c74a488c8028c6ee3f94cb7f58f42165bfd87b6fe4e1f44e089dcf60e0ca5" Jan 05 14:26:42 crc kubenswrapper[4740]: I0105 14:26:42.041323 4740 generic.go:334] "Generic (PLEG): container finished" podID="237ed6e7-19dd-4f03-8da9-6c43db4798c6" containerID="fdbb80735eefeb802f4015d21b213056c819174b21b558cf47658f9e5199e6aa" exitCode=0 Jan 05 14:26:42 crc kubenswrapper[4740]: I0105 14:26:42.042204 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" 
event={"ID":"237ed6e7-19dd-4f03-8da9-6c43db4798c6","Type":"ContainerDied","Data":"fdbb80735eefeb802f4015d21b213056c819174b21b558cf47658f9e5199e6aa"} Jan 05 14:26:43 crc kubenswrapper[4740]: I0105 14:26:43.587801 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:26:43 crc kubenswrapper[4740]: I0105 14:26:43.652462 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ovn-combined-ca-bundle\") pod \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " Jan 05 14:26:43 crc kubenswrapper[4740]: I0105 14:26:43.652886 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ssh-key\") pod \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " Jan 05 14:26:43 crc kubenswrapper[4740]: I0105 14:26:43.653048 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-inventory\") pod \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " Jan 05 14:26:43 crc kubenswrapper[4740]: I0105 14:26:43.653238 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4lx2\" (UniqueName: \"kubernetes.io/projected/237ed6e7-19dd-4f03-8da9-6c43db4798c6-kube-api-access-n4lx2\") pod \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " Jan 05 14:26:43 crc kubenswrapper[4740]: I0105 14:26:43.653493 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ovncontroller-config-0\") pod \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\" (UID: \"237ed6e7-19dd-4f03-8da9-6c43db4798c6\") " Jan 05 14:26:43 crc kubenswrapper[4740]: I0105 14:26:43.659423 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "237ed6e7-19dd-4f03-8da9-6c43db4798c6" (UID: "237ed6e7-19dd-4f03-8da9-6c43db4798c6"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:26:43 crc kubenswrapper[4740]: I0105 14:26:43.663314 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/237ed6e7-19dd-4f03-8da9-6c43db4798c6-kube-api-access-n4lx2" (OuterVolumeSpecName: "kube-api-access-n4lx2") pod "237ed6e7-19dd-4f03-8da9-6c43db4798c6" (UID: "237ed6e7-19dd-4f03-8da9-6c43db4798c6"). InnerVolumeSpecName "kube-api-access-n4lx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:26:43 crc kubenswrapper[4740]: I0105 14:26:43.698118 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "237ed6e7-19dd-4f03-8da9-6c43db4798c6" (UID: "237ed6e7-19dd-4f03-8da9-6c43db4798c6"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:26:43 crc kubenswrapper[4740]: I0105 14:26:43.702224 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "237ed6e7-19dd-4f03-8da9-6c43db4798c6" (UID: "237ed6e7-19dd-4f03-8da9-6c43db4798c6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:26:43 crc kubenswrapper[4740]: I0105 14:26:43.705968 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-inventory" (OuterVolumeSpecName: "inventory") pod "237ed6e7-19dd-4f03-8da9-6c43db4798c6" (UID: "237ed6e7-19dd-4f03-8da9-6c43db4798c6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:26:43 crc kubenswrapper[4740]: I0105 14:26:43.756899 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:26:43 crc kubenswrapper[4740]: I0105 14:26:43.756935 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4lx2\" (UniqueName: \"kubernetes.io/projected/237ed6e7-19dd-4f03-8da9-6c43db4798c6-kube-api-access-n4lx2\") on node \"crc\" DevicePath \"\"" Jan 05 14:26:43 crc kubenswrapper[4740]: I0105 14:26:43.756946 4740 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:26:43 crc kubenswrapper[4740]: I0105 14:26:43.756955 4740 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:26:43 crc kubenswrapper[4740]: I0105 14:26:43.756964 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/237ed6e7-19dd-4f03-8da9-6c43db4798c6-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.074715 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" event={"ID":"237ed6e7-19dd-4f03-8da9-6c43db4798c6","Type":"ContainerDied","Data":"3affcd17bc32de4df5f38c381de12922fca74b53902194c68d3e17128854e395"} Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.075240 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3affcd17bc32de4df5f38c381de12922fca74b53902194c68d3e17128854e395" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.074792 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dl27j" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.174460 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65"] Jan 05 14:26:44 crc kubenswrapper[4740]: E0105 14:26:44.174987 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="237ed6e7-19dd-4f03-8da9-6c43db4798c6" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.175008 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="237ed6e7-19dd-4f03-8da9-6c43db4798c6" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.175268 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="237ed6e7-19dd-4f03-8da9-6c43db4798c6" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.176087 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.180318 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.181124 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.181228 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.181301 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.181363 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.181734 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.186694 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65"] Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.271744 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.271852 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.271962 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.272084 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pwrb\" (UniqueName: \"kubernetes.io/projected/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-kube-api-access-5pwrb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.272172 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.272209 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.374037 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pwrb\" (UniqueName: \"kubernetes.io/projected/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-kube-api-access-5pwrb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.374494 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.374745 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.375325 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: 
\"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.375462 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.376717 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.381480 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.386467 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.388565 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.392578 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.394638 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pwrb\" (UniqueName: \"kubernetes.io/projected/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-kube-api-access-5pwrb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.401134 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:44 crc kubenswrapper[4740]: I0105 14:26:44.512823 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:26:45 crc kubenswrapper[4740]: I0105 14:26:45.187271 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65"] Jan 05 14:26:46 crc kubenswrapper[4740]: I0105 14:26:46.119394 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" event={"ID":"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867","Type":"ContainerStarted","Data":"017e5cf44c9809172edaa5396f1c00a0d3d6454ab1a062f409b84e50e20441cd"} Jan 05 14:26:46 crc kubenswrapper[4740]: I0105 14:26:46.119727 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" event={"ID":"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867","Type":"ContainerStarted","Data":"395b06461a7d97a5b95cbbc6c2dbf1bac170d0f46196d8ac4ddcefafb5e641df"} Jan 05 14:26:46 crc kubenswrapper[4740]: I0105 14:26:46.136489 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" podStartSLOduration=1.6075644279999999 podStartE2EDuration="2.13647289s" podCreationTimestamp="2026-01-05 14:26:44 +0000 UTC" firstStartedPulling="2026-01-05 14:26:45.197540429 +0000 UTC m=+2254.504449008" lastFinishedPulling="2026-01-05 14:26:45.726448891 +0000 UTC m=+2255.033357470" observedRunningTime="2026-01-05 14:26:46.135399162 +0000 UTC m=+2255.442307751" watchObservedRunningTime="2026-01-05 14:26:46.13647289 +0000 UTC m=+2255.443381479" Jan 05 14:27:46 crc kubenswrapper[4740]: I0105 14:27:46.901284 4740 generic.go:334] "Generic (PLEG): container finished" podID="5d3a3a3d-0940-4e1f-91cd-2720ec3d7867" containerID="017e5cf44c9809172edaa5396f1c00a0d3d6454ab1a062f409b84e50e20441cd" exitCode=0 Jan 05 14:27:46 crc kubenswrapper[4740]: I0105 14:27:46.901341 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" event={"ID":"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867","Type":"ContainerDied","Data":"017e5cf44c9809172edaa5396f1c00a0d3d6454ab1a062f409b84e50e20441cd"} Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.472891 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.543136 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-neutron-ovn-metadata-agent-neutron-config-0\") pod \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.543182 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-nova-metadata-neutron-config-0\") pod \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.543270 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-neutron-metadata-combined-ca-bundle\") pod \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.543313 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pwrb\" (UniqueName: \"kubernetes.io/projected/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-kube-api-access-5pwrb\") pod \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.543411 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-ssh-key\") pod \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.554109 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-kube-api-access-5pwrb" (OuterVolumeSpecName: "kube-api-access-5pwrb") pod "5d3a3a3d-0940-4e1f-91cd-2720ec3d7867" (UID: "5d3a3a3d-0940-4e1f-91cd-2720ec3d7867"). InnerVolumeSpecName "kube-api-access-5pwrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.564916 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "5d3a3a3d-0940-4e1f-91cd-2720ec3d7867" (UID: "5d3a3a3d-0940-4e1f-91cd-2720ec3d7867"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.579235 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "5d3a3a3d-0940-4e1f-91cd-2720ec3d7867" (UID: "5d3a3a3d-0940-4e1f-91cd-2720ec3d7867"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.580670 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "5d3a3a3d-0940-4e1f-91cd-2720ec3d7867" (UID: "5d3a3a3d-0940-4e1f-91cd-2720ec3d7867"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.589549 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5d3a3a3d-0940-4e1f-91cd-2720ec3d7867" (UID: "5d3a3a3d-0940-4e1f-91cd-2720ec3d7867"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.645211 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-inventory\") pod \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\" (UID: \"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867\") " Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.645707 4740 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.645727 4740 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.645739 4740 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.645748 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pwrb\" (UniqueName: \"kubernetes.io/projected/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-kube-api-access-5pwrb\") on node \"crc\" DevicePath \"\"" Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.645756 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.672760 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-inventory" (OuterVolumeSpecName: "inventory") pod "5d3a3a3d-0940-4e1f-91cd-2720ec3d7867" (UID: "5d3a3a3d-0940-4e1f-91cd-2720ec3d7867"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.748169 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d3a3a3d-0940-4e1f-91cd-2720ec3d7867-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.925389 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" event={"ID":"5d3a3a3d-0940-4e1f-91cd-2720ec3d7867","Type":"ContainerDied","Data":"395b06461a7d97a5b95cbbc6c2dbf1bac170d0f46196d8ac4ddcefafb5e641df"} Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.925694 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="395b06461a7d97a5b95cbbc6c2dbf1bac170d0f46196d8ac4ddcefafb5e641df" Jan 05 14:27:48 crc kubenswrapper[4740]: I0105 14:27:48.925447 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.052983 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5"] Jan 05 14:27:49 crc kubenswrapper[4740]: E0105 14:27:49.053668 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d3a3a3d-0940-4e1f-91cd-2720ec3d7867" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.053684 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3a3a3d-0940-4e1f-91cd-2720ec3d7867" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.054049 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d3a3a3d-0940-4e1f-91cd-2720ec3d7867" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.055123 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.064585 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5"] Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.065777 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.065862 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.066048 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.066057 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.066298 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.157206 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.157541 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.157874 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.158156 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggmnq\" (UniqueName: \"kubernetes.io/projected/7fd8586e-a676-4220-a8b8-1b75b6d9a789-kube-api-access-ggmnq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.158226 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.260254 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.260392 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.260463 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggmnq\" (UniqueName: \"kubernetes.io/projected/7fd8586e-a676-4220-a8b8-1b75b6d9a789-kube-api-access-ggmnq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.260493 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.260524 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.266104 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.268274 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.268447 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.268806 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-inventory\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.278951 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggmnq\" (UniqueName: \"kubernetes.io/projected/7fd8586e-a676-4220-a8b8-1b75b6d9a789-kube-api-access-ggmnq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:27:49 crc kubenswrapper[4740]: I0105 14:27:49.387006 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:27:50 crc kubenswrapper[4740]: I0105 14:27:50.011665 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5"] Jan 05 14:27:50 crc kubenswrapper[4740]: I0105 14:27:50.953451 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" event={"ID":"7fd8586e-a676-4220-a8b8-1b75b6d9a789","Type":"ContainerStarted","Data":"b331222613c2c65e510aa6ce8da9909c74c4240fc459da9c82bdb581ec5ec840"} Jan 05 14:27:51 crc kubenswrapper[4740]: I0105 14:27:51.974541 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" event={"ID":"7fd8586e-a676-4220-a8b8-1b75b6d9a789","Type":"ContainerStarted","Data":"b2fc6f1dffc0ad65d7e4367fa44b1a093cd720d17e951e58460b63f1298eb814"} Jan 05 14:27:52 crc kubenswrapper[4740]: I0105 14:27:52.002306 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" podStartSLOduration=2.466340624 podStartE2EDuration="3.002284817s" podCreationTimestamp="2026-01-05 14:27:49 +0000 UTC" firstStartedPulling="2026-01-05 14:27:50.024584619 +0000 UTC m=+2319.331493208" lastFinishedPulling="2026-01-05 14:27:50.560528812 +0000 UTC m=+2319.867437401" observedRunningTime="2026-01-05 14:27:51.996523851 +0000 UTC m=+2321.303432470" watchObservedRunningTime="2026-01-05 14:27:52.002284817 +0000 UTC m=+2321.309193416" Jan 05 14:28:31 crc kubenswrapper[4740]: I0105 14:28:31.915861 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:28:31 crc kubenswrapper[4740]: I0105 14:28:31.916456 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:29:01 crc kubenswrapper[4740]: I0105 14:29:01.916142 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:29:01 crc kubenswrapper[4740]: I0105 14:29:01.916637 4740 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:29:21 crc kubenswrapper[4740]: I0105 14:29:21.745114 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jzknh"] Jan 05 14:29:21 crc kubenswrapper[4740]: I0105 14:29:21.747930 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jzknh" Jan 05 14:29:21 crc kubenswrapper[4740]: I0105 14:29:21.759383 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jzknh"] Jan 05 14:29:21 crc kubenswrapper[4740]: I0105 14:29:21.926336 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3306080-27c9-401c-8d11-de91bed34937-catalog-content\") pod \"redhat-operators-jzknh\" (UID: \"c3306080-27c9-401c-8d11-de91bed34937\") " pod="openshift-marketplace/redhat-operators-jzknh" Jan 05 14:29:21 crc kubenswrapper[4740]: I0105 14:29:21.926581 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3306080-27c9-401c-8d11-de91bed34937-utilities\") pod \"redhat-operators-jzknh\" (UID: \"c3306080-27c9-401c-8d11-de91bed34937\") " pod="openshift-marketplace/redhat-operators-jzknh" Jan 05 14:29:21 crc kubenswrapper[4740]: I0105 14:29:21.926863 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd7sr\" (UniqueName: \"kubernetes.io/projected/c3306080-27c9-401c-8d11-de91bed34937-kube-api-access-nd7sr\") pod \"redhat-operators-jzknh\" (UID: \"c3306080-27c9-401c-8d11-de91bed34937\") " pod="openshift-marketplace/redhat-operators-jzknh" Jan 05 14:29:22 crc kubenswrapper[4740]: I0105 14:29:22.029139 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd7sr\" (UniqueName: \"kubernetes.io/projected/c3306080-27c9-401c-8d11-de91bed34937-kube-api-access-nd7sr\") pod \"redhat-operators-jzknh\" (UID: \"c3306080-27c9-401c-8d11-de91bed34937\") " pod="openshift-marketplace/redhat-operators-jzknh" Jan 05 14:29:22 crc kubenswrapper[4740]: I0105 14:29:22.029265 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3306080-27c9-401c-8d11-de91bed34937-catalog-content\") pod \"redhat-operators-jzknh\" (UID: \"c3306080-27c9-401c-8d11-de91bed34937\") " pod="openshift-marketplace/redhat-operators-jzknh" Jan 05 14:29:22 crc kubenswrapper[4740]: I0105 14:29:22.029425 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3306080-27c9-401c-8d11-de91bed34937-utilities\") pod \"redhat-operators-jzknh\" (UID: \"c3306080-27c9-401c-8d11-de91bed34937\") " pod="openshift-marketplace/redhat-operators-jzknh" Jan 05 14:29:22 crc kubenswrapper[4740]: I0105 14:29:22.029768 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3306080-27c9-401c-8d11-de91bed34937-catalog-content\") pod \"redhat-operators-jzknh\" (UID: \"c3306080-27c9-401c-8d11-de91bed34937\") " 
pod="openshift-marketplace/redhat-operators-jzknh" Jan 05 14:29:22 crc kubenswrapper[4740]: I0105 14:29:22.029994 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3306080-27c9-401c-8d11-de91bed34937-utilities\") pod \"redhat-operators-jzknh\" (UID: \"c3306080-27c9-401c-8d11-de91bed34937\") " pod="openshift-marketplace/redhat-operators-jzknh" Jan 05 14:29:22 crc kubenswrapper[4740]: I0105 14:29:22.062837 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd7sr\" (UniqueName: \"kubernetes.io/projected/c3306080-27c9-401c-8d11-de91bed34937-kube-api-access-nd7sr\") pod \"redhat-operators-jzknh\" (UID: \"c3306080-27c9-401c-8d11-de91bed34937\") " pod="openshift-marketplace/redhat-operators-jzknh" Jan 05 14:29:22 crc kubenswrapper[4740]: I0105 14:29:22.084487 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jzknh" Jan 05 14:29:22 crc kubenswrapper[4740]: I0105 14:29:22.583043 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jzknh"] Jan 05 14:29:23 crc kubenswrapper[4740]: I0105 14:29:23.292467 4740 generic.go:334] "Generic (PLEG): container finished" podID="c3306080-27c9-401c-8d11-de91bed34937" containerID="e1fe50f4594c17636f08d5f118f222d5bc2b5f758c0cc1114ca46056f60dee8e" exitCode=0 Jan 05 14:29:23 crc kubenswrapper[4740]: I0105 14:29:23.292755 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzknh" event={"ID":"c3306080-27c9-401c-8d11-de91bed34937","Type":"ContainerDied","Data":"e1fe50f4594c17636f08d5f118f222d5bc2b5f758c0cc1114ca46056f60dee8e"} Jan 05 14:29:23 crc kubenswrapper[4740]: I0105 14:29:23.292779 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzknh" event={"ID":"c3306080-27c9-401c-8d11-de91bed34937","Type":"ContainerStarted","Data":"e863ecaab31dbbe8f4441d350f311b86cfd87b617ff29848ff50b7d0e28d2cd5"} Jan 05 14:29:23 crc kubenswrapper[4740]: I0105 14:29:23.295054 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 05 14:29:25 crc kubenswrapper[4740]: I0105 14:29:25.320921 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzknh" event={"ID":"c3306080-27c9-401c-8d11-de91bed34937","Type":"ContainerStarted","Data":"4b35f8f3eabd518c0a7aa645f02d6014e9bdc96091918874ed55ae51d7b2c767"} Jan 05 14:29:28 crc kubenswrapper[4740]: I0105 14:29:28.364203 4740 generic.go:334] "Generic (PLEG): container finished" podID="c3306080-27c9-401c-8d11-de91bed34937" containerID="4b35f8f3eabd518c0a7aa645f02d6014e9bdc96091918874ed55ae51d7b2c767" exitCode=0 Jan 05 14:29:28 crc kubenswrapper[4740]: I0105 14:29:28.364324 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzknh" event={"ID":"c3306080-27c9-401c-8d11-de91bed34937","Type":"ContainerDied","Data":"4b35f8f3eabd518c0a7aa645f02d6014e9bdc96091918874ed55ae51d7b2c767"} Jan 05 14:29:29 crc kubenswrapper[4740]: I0105 14:29:29.377000 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzknh" event={"ID":"c3306080-27c9-401c-8d11-de91bed34937","Type":"ContainerStarted","Data":"8ada9e4405c279d274559e69d258ee02097b07b04614d8bb7a6388b3ac082df2"} Jan 05 14:29:29 crc kubenswrapper[4740]: I0105 14:29:29.408526 4740 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jzknh" podStartSLOduration=2.854522014 podStartE2EDuration="8.408506334s" podCreationTimestamp="2026-01-05 14:29:21 +0000 UTC" firstStartedPulling="2026-01-05 14:29:23.294870894 +0000 UTC m=+2412.601779473" lastFinishedPulling="2026-01-05 14:29:28.848855184 +0000 UTC m=+2418.155763793" observedRunningTime="2026-01-05 14:29:29.399896081 +0000 UTC m=+2418.706804660" watchObservedRunningTime="2026-01-05 14:29:29.408506334 +0000 UTC m=+2418.715414913" Jan 05 14:29:31 crc kubenswrapper[4740]: I0105 14:29:31.916034 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:29:31 crc kubenswrapper[4740]: I0105 14:29:31.916428 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:29:31 crc kubenswrapper[4740]: I0105 14:29:31.916475 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 14:29:31 crc kubenswrapper[4740]: I0105 14:29:31.917097 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 14:29:31 crc kubenswrapper[4740]: I0105 14:29:31.917163 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" gracePeriod=600 Jan 05 14:29:32 crc kubenswrapper[4740]: E0105 14:29:32.044630 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:29:32 crc kubenswrapper[4740]: I0105 14:29:32.084919 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jzknh" Jan 05 14:29:32 crc kubenswrapper[4740]: I0105 14:29:32.084982 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jzknh" Jan 05 14:29:32 crc kubenswrapper[4740]: I0105 14:29:32.437122 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" exitCode=0 Jan 05 14:29:32 crc kubenswrapper[4740]: I0105 14:29:32.437189 4740 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b"} Jan 05 14:29:32 crc kubenswrapper[4740]: I0105 14:29:32.437241 4740 scope.go:117] "RemoveContainer" containerID="87db5e19923a3a4961dbb7d28184d9af48401a591023a911c3f7453ebbf03439" Jan 05 14:29:32 crc kubenswrapper[4740]: I0105 14:29:32.438571 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:29:32 crc kubenswrapper[4740]: E0105 14:29:32.439497 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:29:33 crc kubenswrapper[4740]: I0105 14:29:33.136261 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jzknh" podUID="c3306080-27c9-401c-8d11-de91bed34937" containerName="registry-server" probeResult="failure" output=< Jan 05 14:29:33 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 14:29:33 crc kubenswrapper[4740]: > Jan 05 14:29:42 crc kubenswrapper[4740]: I0105 14:29:42.147473 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jzknh" Jan 05 14:29:42 crc kubenswrapper[4740]: I0105 14:29:42.226935 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jzknh" Jan 05 14:29:42 crc kubenswrapper[4740]: I0105 14:29:42.402924 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jzknh"] Jan 05 14:29:43 crc kubenswrapper[4740]: I0105 14:29:43.606897 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jzknh" podUID="c3306080-27c9-401c-8d11-de91bed34937" containerName="registry-server" containerID="cri-o://8ada9e4405c279d274559e69d258ee02097b07b04614d8bb7a6388b3ac082df2" gracePeriod=2 Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.170386 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jzknh" Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.259759 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3306080-27c9-401c-8d11-de91bed34937-catalog-content\") pod \"c3306080-27c9-401c-8d11-de91bed34937\" (UID: \"c3306080-27c9-401c-8d11-de91bed34937\") " Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.259966 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3306080-27c9-401c-8d11-de91bed34937-utilities\") pod \"c3306080-27c9-401c-8d11-de91bed34937\" (UID: \"c3306080-27c9-401c-8d11-de91bed34937\") " Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.260175 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd7sr\" (UniqueName: \"kubernetes.io/projected/c3306080-27c9-401c-8d11-de91bed34937-kube-api-access-nd7sr\") pod \"c3306080-27c9-401c-8d11-de91bed34937\" (UID: \"c3306080-27c9-401c-8d11-de91bed34937\") " Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.260833 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3306080-27c9-401c-8d11-de91bed34937-utilities" (OuterVolumeSpecName: "utilities") pod "c3306080-27c9-401c-8d11-de91bed34937" (UID: "c3306080-27c9-401c-8d11-de91bed34937"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.273960 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3306080-27c9-401c-8d11-de91bed34937-kube-api-access-nd7sr" (OuterVolumeSpecName: "kube-api-access-nd7sr") pod "c3306080-27c9-401c-8d11-de91bed34937" (UID: "c3306080-27c9-401c-8d11-de91bed34937"). InnerVolumeSpecName "kube-api-access-nd7sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.363309 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3306080-27c9-401c-8d11-de91bed34937-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.363361 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd7sr\" (UniqueName: \"kubernetes.io/projected/c3306080-27c9-401c-8d11-de91bed34937-kube-api-access-nd7sr\") on node \"crc\" DevicePath \"\"" Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.384001 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3306080-27c9-401c-8d11-de91bed34937-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3306080-27c9-401c-8d11-de91bed34937" (UID: "c3306080-27c9-401c-8d11-de91bed34937"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.465353 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3306080-27c9-401c-8d11-de91bed34937-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.620413 4740 generic.go:334] "Generic (PLEG): container finished" podID="c3306080-27c9-401c-8d11-de91bed34937" containerID="8ada9e4405c279d274559e69d258ee02097b07b04614d8bb7a6388b3ac082df2" exitCode=0 Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.620472 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzknh" event={"ID":"c3306080-27c9-401c-8d11-de91bed34937","Type":"ContainerDied","Data":"8ada9e4405c279d274559e69d258ee02097b07b04614d8bb7a6388b3ac082df2"} Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.620501 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jzknh" Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.620544 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzknh" event={"ID":"c3306080-27c9-401c-8d11-de91bed34937","Type":"ContainerDied","Data":"e863ecaab31dbbe8f4441d350f311b86cfd87b617ff29848ff50b7d0e28d2cd5"} Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.620576 4740 scope.go:117] "RemoveContainer" containerID="8ada9e4405c279d274559e69d258ee02097b07b04614d8bb7a6388b3ac082df2" Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.664169 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jzknh"] Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.666052 4740 scope.go:117] "RemoveContainer" containerID="4b35f8f3eabd518c0a7aa645f02d6014e9bdc96091918874ed55ae51d7b2c767" Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.676829 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jzknh"] Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.701798 4740 scope.go:117] "RemoveContainer" containerID="e1fe50f4594c17636f08d5f118f222d5bc2b5f758c0cc1114ca46056f60dee8e" Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.748791 4740 scope.go:117] "RemoveContainer" containerID="8ada9e4405c279d274559e69d258ee02097b07b04614d8bb7a6388b3ac082df2" Jan 05 14:29:44 crc kubenswrapper[4740]: E0105 14:29:44.749284 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ada9e4405c279d274559e69d258ee02097b07b04614d8bb7a6388b3ac082df2\": container with ID starting with 8ada9e4405c279d274559e69d258ee02097b07b04614d8bb7a6388b3ac082df2 not found: ID does not exist" containerID="8ada9e4405c279d274559e69d258ee02097b07b04614d8bb7a6388b3ac082df2" Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.749330 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ada9e4405c279d274559e69d258ee02097b07b04614d8bb7a6388b3ac082df2"} err="failed to get container status \"8ada9e4405c279d274559e69d258ee02097b07b04614d8bb7a6388b3ac082df2\": rpc error: code = NotFound desc = could not find container \"8ada9e4405c279d274559e69d258ee02097b07b04614d8bb7a6388b3ac082df2\": container with ID starting with 8ada9e4405c279d274559e69d258ee02097b07b04614d8bb7a6388b3ac082df2 not found: ID does not exist" Jan 05 14:29:44 crc 
kubenswrapper[4740]: I0105 14:29:44.749365 4740 scope.go:117] "RemoveContainer" containerID="4b35f8f3eabd518c0a7aa645f02d6014e9bdc96091918874ed55ae51d7b2c767" Jan 05 14:29:44 crc kubenswrapper[4740]: E0105 14:29:44.749718 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b35f8f3eabd518c0a7aa645f02d6014e9bdc96091918874ed55ae51d7b2c767\": container with ID starting with 4b35f8f3eabd518c0a7aa645f02d6014e9bdc96091918874ed55ae51d7b2c767 not found: ID does not exist" containerID="4b35f8f3eabd518c0a7aa645f02d6014e9bdc96091918874ed55ae51d7b2c767" Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.749774 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b35f8f3eabd518c0a7aa645f02d6014e9bdc96091918874ed55ae51d7b2c767"} err="failed to get container status \"4b35f8f3eabd518c0a7aa645f02d6014e9bdc96091918874ed55ae51d7b2c767\": rpc error: code = NotFound desc = could not find container \"4b35f8f3eabd518c0a7aa645f02d6014e9bdc96091918874ed55ae51d7b2c767\": container with ID starting with 4b35f8f3eabd518c0a7aa645f02d6014e9bdc96091918874ed55ae51d7b2c767 not found: ID does not exist" Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.749808 4740 scope.go:117] "RemoveContainer" containerID="e1fe50f4594c17636f08d5f118f222d5bc2b5f758c0cc1114ca46056f60dee8e" Jan 05 14:29:44 crc kubenswrapper[4740]: E0105 14:29:44.750304 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1fe50f4594c17636f08d5f118f222d5bc2b5f758c0cc1114ca46056f60dee8e\": container with ID starting with e1fe50f4594c17636f08d5f118f222d5bc2b5f758c0cc1114ca46056f60dee8e not found: ID does not exist" containerID="e1fe50f4594c17636f08d5f118f222d5bc2b5f758c0cc1114ca46056f60dee8e" Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.750400 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1fe50f4594c17636f08d5f118f222d5bc2b5f758c0cc1114ca46056f60dee8e"} err="failed to get container status \"e1fe50f4594c17636f08d5f118f222d5bc2b5f758c0cc1114ca46056f60dee8e\": rpc error: code = NotFound desc = could not find container \"e1fe50f4594c17636f08d5f118f222d5bc2b5f758c0cc1114ca46056f60dee8e\": container with ID starting with e1fe50f4594c17636f08d5f118f222d5bc2b5f758c0cc1114ca46056f60dee8e not found: ID does not exist" Jan 05 14:29:44 crc kubenswrapper[4740]: I0105 14:29:44.985787 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3306080-27c9-401c-8d11-de91bed34937" path="/var/lib/kubelet/pods/c3306080-27c9-401c-8d11-de91bed34937/volumes" Jan 05 14:29:47 crc kubenswrapper[4740]: I0105 14:29:47.970344 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:29:47 crc kubenswrapper[4740]: E0105 14:29:47.970985 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:29:58 crc kubenswrapper[4740]: I0105 14:29:58.968637 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" 
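
The block above shows the kubelet's liveness handling for machine-config-daemon end to end: the HTTP probe against http://127.0.0.1:8798/health fails with "connection refused", the container is killed with a grace period (gracePeriod=600 in the log), and later restart attempts are throttled by CrashLoopBackOff up to the 5m0s back-off repeated in the "Error syncing pod, skipping" entries. As a minimal sketch of what such a probe does, the standalone Go program below issues the same GET against the endpoint taken from the log; the 1 s timeout and the 2xx/3xx success criterion are illustrative assumptions, not the pod's actual probe settings.

// probe_check.go: hedged sketch of the HTTP liveness check seen above.
// Only the endpoint (127.0.0.1:8798/health) is taken from the log; the
// timeout and success criterion are assumptions for illustration.
package main

import (
	"errors"
	"fmt"
	"net/http"
	"os"
	"syscall"
	"time"
)

func main() {
	client := &http.Client{Timeout: 1 * time.Second}

	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		// "connection refused" (the failure mode in the log) means nothing is
		// listening on the port, i.e. the daemon process is down or restarting.
		if errors.Is(err, syscall.ECONNREFUSED) {
			fmt.Println("probe failure: connection refused (no listener on :8798)")
		} else {
			fmt.Printf("probe failure: %v\n", err)
		}
		os.Exit(1)
	}
	defer resp.Body.Close()

	// Treat 2xx/3xx responses as healthy, mirroring HTTP probe semantics.
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		fmt.Printf("probe success: HTTP %d\n", resp.StatusCode)
		return
	}
	fmt.Printf("probe failure: HTTP %d\n", resp.StatusCode)
	os.Exit(1)
}

The same "nothing listening yet" pattern accounts for the earlier registry-server startup probe failure ("timeout: failed to connect service ':50051' within 1s"): the catalog server had not begun accepting connections at 14:29:33, and the startup and readiness probes reported "started" and "ready" once it did at 14:29:42.
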
Jan 05 14:29:58 crc kubenswrapper[4740]: E0105 14:29:58.969704 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.161223 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4"] Jan 05 14:30:00 crc kubenswrapper[4740]: E0105 14:30:00.162421 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3306080-27c9-401c-8d11-de91bed34937" containerName="extract-utilities" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.162444 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3306080-27c9-401c-8d11-de91bed34937" containerName="extract-utilities" Jan 05 14:30:00 crc kubenswrapper[4740]: E0105 14:30:00.162511 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3306080-27c9-401c-8d11-de91bed34937" containerName="extract-content" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.162525 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3306080-27c9-401c-8d11-de91bed34937" containerName="extract-content" Jan 05 14:30:00 crc kubenswrapper[4740]: E0105 14:30:00.162553 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3306080-27c9-401c-8d11-de91bed34937" containerName="registry-server" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.162566 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3306080-27c9-401c-8d11-de91bed34937" containerName="registry-server" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.163019 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3306080-27c9-401c-8d11-de91bed34937" containerName="registry-server" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.164598 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.167260 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.167889 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.202540 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4"] Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.300840 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a74ca1f5-6afe-4622-9306-f779d6f7dda3-config-volume\") pod \"collect-profiles-29460390-bbdh4\" (UID: \"a74ca1f5-6afe-4622-9306-f779d6f7dda3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.300886 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a74ca1f5-6afe-4622-9306-f779d6f7dda3-secret-volume\") pod \"collect-profiles-29460390-bbdh4\" (UID: \"a74ca1f5-6afe-4622-9306-f779d6f7dda3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.300967 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbggd\" (UniqueName: \"kubernetes.io/projected/a74ca1f5-6afe-4622-9306-f779d6f7dda3-kube-api-access-dbggd\") pod \"collect-profiles-29460390-bbdh4\" (UID: \"a74ca1f5-6afe-4622-9306-f779d6f7dda3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.402935 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a74ca1f5-6afe-4622-9306-f779d6f7dda3-config-volume\") pod \"collect-profiles-29460390-bbdh4\" (UID: \"a74ca1f5-6afe-4622-9306-f779d6f7dda3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.402992 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a74ca1f5-6afe-4622-9306-f779d6f7dda3-secret-volume\") pod \"collect-profiles-29460390-bbdh4\" (UID: \"a74ca1f5-6afe-4622-9306-f779d6f7dda3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.403112 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbggd\" (UniqueName: \"kubernetes.io/projected/a74ca1f5-6afe-4622-9306-f779d6f7dda3-kube-api-access-dbggd\") pod \"collect-profiles-29460390-bbdh4\" (UID: \"a74ca1f5-6afe-4622-9306-f779d6f7dda3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.403931 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a74ca1f5-6afe-4622-9306-f779d6f7dda3-config-volume\") pod 
\"collect-profiles-29460390-bbdh4\" (UID: \"a74ca1f5-6afe-4622-9306-f779d6f7dda3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.424953 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a74ca1f5-6afe-4622-9306-f779d6f7dda3-secret-volume\") pod \"collect-profiles-29460390-bbdh4\" (UID: \"a74ca1f5-6afe-4622-9306-f779d6f7dda3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.441911 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbggd\" (UniqueName: \"kubernetes.io/projected/a74ca1f5-6afe-4622-9306-f779d6f7dda3-kube-api-access-dbggd\") pod \"collect-profiles-29460390-bbdh4\" (UID: \"a74ca1f5-6afe-4622-9306-f779d6f7dda3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4" Jan 05 14:30:00 crc kubenswrapper[4740]: I0105 14:30:00.506619 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4" Jan 05 14:30:01 crc kubenswrapper[4740]: I0105 14:30:01.033387 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4"] Jan 05 14:30:01 crc kubenswrapper[4740]: W0105 14:30:01.048130 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda74ca1f5_6afe_4622_9306_f779d6f7dda3.slice/crio-50a4b86ae76ba27effbf35a7375aa2c31e0cc26ad9a9aee7e11f61406d8b666a WatchSource:0}: Error finding container 50a4b86ae76ba27effbf35a7375aa2c31e0cc26ad9a9aee7e11f61406d8b666a: Status 404 returned error can't find the container with id 50a4b86ae76ba27effbf35a7375aa2c31e0cc26ad9a9aee7e11f61406d8b666a Jan 05 14:30:01 crc kubenswrapper[4740]: I0105 14:30:01.829287 4740 generic.go:334] "Generic (PLEG): container finished" podID="a74ca1f5-6afe-4622-9306-f779d6f7dda3" containerID="e2357239d27bc636bc43697a42f98c8365edc9e8fb1eb001919d63a3fc4372e0" exitCode=0 Jan 05 14:30:01 crc kubenswrapper[4740]: I0105 14:30:01.829349 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4" event={"ID":"a74ca1f5-6afe-4622-9306-f779d6f7dda3","Type":"ContainerDied","Data":"e2357239d27bc636bc43697a42f98c8365edc9e8fb1eb001919d63a3fc4372e0"} Jan 05 14:30:01 crc kubenswrapper[4740]: I0105 14:30:01.829688 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4" event={"ID":"a74ca1f5-6afe-4622-9306-f779d6f7dda3","Type":"ContainerStarted","Data":"50a4b86ae76ba27effbf35a7375aa2c31e0cc26ad9a9aee7e11f61406d8b666a"} Jan 05 14:30:03 crc kubenswrapper[4740]: I0105 14:30:03.359355 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4" Jan 05 14:30:03 crc kubenswrapper[4740]: I0105 14:30:03.484424 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbggd\" (UniqueName: \"kubernetes.io/projected/a74ca1f5-6afe-4622-9306-f779d6f7dda3-kube-api-access-dbggd\") pod \"a74ca1f5-6afe-4622-9306-f779d6f7dda3\" (UID: \"a74ca1f5-6afe-4622-9306-f779d6f7dda3\") " Jan 05 14:30:03 crc kubenswrapper[4740]: I0105 14:30:03.484929 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a74ca1f5-6afe-4622-9306-f779d6f7dda3-secret-volume\") pod \"a74ca1f5-6afe-4622-9306-f779d6f7dda3\" (UID: \"a74ca1f5-6afe-4622-9306-f779d6f7dda3\") " Jan 05 14:30:03 crc kubenswrapper[4740]: I0105 14:30:03.485182 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a74ca1f5-6afe-4622-9306-f779d6f7dda3-config-volume\") pod \"a74ca1f5-6afe-4622-9306-f779d6f7dda3\" (UID: \"a74ca1f5-6afe-4622-9306-f779d6f7dda3\") " Jan 05 14:30:03 crc kubenswrapper[4740]: I0105 14:30:03.486058 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a74ca1f5-6afe-4622-9306-f779d6f7dda3-config-volume" (OuterVolumeSpecName: "config-volume") pod "a74ca1f5-6afe-4622-9306-f779d6f7dda3" (UID: "a74ca1f5-6afe-4622-9306-f779d6f7dda3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:30:03 crc kubenswrapper[4740]: I0105 14:30:03.490478 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a74ca1f5-6afe-4622-9306-f779d6f7dda3-kube-api-access-dbggd" (OuterVolumeSpecName: "kube-api-access-dbggd") pod "a74ca1f5-6afe-4622-9306-f779d6f7dda3" (UID: "a74ca1f5-6afe-4622-9306-f779d6f7dda3"). InnerVolumeSpecName "kube-api-access-dbggd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:30:03 crc kubenswrapper[4740]: I0105 14:30:03.491109 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74ca1f5-6afe-4622-9306-f779d6f7dda3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a74ca1f5-6afe-4622-9306-f779d6f7dda3" (UID: "a74ca1f5-6afe-4622-9306-f779d6f7dda3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:30:03 crc kubenswrapper[4740]: I0105 14:30:03.587550 4740 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a74ca1f5-6afe-4622-9306-f779d6f7dda3-config-volume\") on node \"crc\" DevicePath \"\"" Jan 05 14:30:03 crc kubenswrapper[4740]: I0105 14:30:03.587577 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbggd\" (UniqueName: \"kubernetes.io/projected/a74ca1f5-6afe-4622-9306-f779d6f7dda3-kube-api-access-dbggd\") on node \"crc\" DevicePath \"\"" Jan 05 14:30:03 crc kubenswrapper[4740]: I0105 14:30:03.587587 4740 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a74ca1f5-6afe-4622-9306-f779d6f7dda3-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 05 14:30:03 crc kubenswrapper[4740]: I0105 14:30:03.855748 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4" event={"ID":"a74ca1f5-6afe-4622-9306-f779d6f7dda3","Type":"ContainerDied","Data":"50a4b86ae76ba27effbf35a7375aa2c31e0cc26ad9a9aee7e11f61406d8b666a"} Jan 05 14:30:03 crc kubenswrapper[4740]: I0105 14:30:03.855788 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50a4b86ae76ba27effbf35a7375aa2c31e0cc26ad9a9aee7e11f61406d8b666a" Jan 05 14:30:03 crc kubenswrapper[4740]: I0105 14:30:03.855852 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4" Jan 05 14:30:04 crc kubenswrapper[4740]: I0105 14:30:04.470341 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw"] Jan 05 14:30:04 crc kubenswrapper[4740]: I0105 14:30:04.481190 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460345-f27lw"] Jan 05 14:30:04 crc kubenswrapper[4740]: I0105 14:30:04.983058 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="409599e0-5f32-4b72-9c6a-73c9d9d4cc63" path="/var/lib/kubelet/pods/409599e0-5f32-4b72-9c6a-73c9d9d4cc63/volumes" Jan 05 14:30:09 crc kubenswrapper[4740]: I0105 14:30:09.968822 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:30:09 crc kubenswrapper[4740]: E0105 14:30:09.969646 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:30:20 crc kubenswrapper[4740]: I0105 14:30:20.984650 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:30:20 crc kubenswrapper[4740]: E0105 14:30:20.985726 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:30:33 crc kubenswrapper[4740]: I0105 14:30:33.141194 4740 scope.go:117] "RemoveContainer" containerID="6376a4ccdd7d44d6d9831a5b4572a41715dbcb7f2d9c609a98891f8118d7abee" Jan 05 14:30:34 crc kubenswrapper[4740]: I0105 14:30:34.968566 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:30:34 crc kubenswrapper[4740]: E0105 14:30:34.969541 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:30:45 crc kubenswrapper[4740]: I0105 14:30:45.968954 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:30:45 crc kubenswrapper[4740]: E0105 14:30:45.970680 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:30:59 crc kubenswrapper[4740]: I0105 14:30:59.969460 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:30:59 crc kubenswrapper[4740]: E0105 14:30:59.970824 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:31:14 crc kubenswrapper[4740]: I0105 14:31:14.969457 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:31:14 crc kubenswrapper[4740]: E0105 14:31:14.970319 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:31:28 crc kubenswrapper[4740]: I0105 14:31:28.969878 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:31:28 crc kubenswrapper[4740]: E0105 14:31:28.971431 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:31:39 crc kubenswrapper[4740]: I0105 14:31:39.969126 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:31:39 crc kubenswrapper[4740]: E0105 14:31:39.970076 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:31:52 crc kubenswrapper[4740]: I0105 14:31:52.969405 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:31:52 crc kubenswrapper[4740]: E0105 14:31:52.970746 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:32:03 crc kubenswrapper[4740]: I0105 14:32:03.969302 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:32:03 crc kubenswrapper[4740]: E0105 14:32:03.970335 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:32:18 crc kubenswrapper[4740]: I0105 14:32:18.968879 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:32:18 crc kubenswrapper[4740]: E0105 14:32:18.970854 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:32:25 crc kubenswrapper[4740]: I0105 14:32:25.690200 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wpl6k"] Jan 05 14:32:25 crc kubenswrapper[4740]: E0105 14:32:25.691378 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a74ca1f5-6afe-4622-9306-f779d6f7dda3" containerName="collect-profiles" Jan 05 14:32:25 crc kubenswrapper[4740]: I0105 14:32:25.691396 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="a74ca1f5-6afe-4622-9306-f779d6f7dda3" containerName="collect-profiles" Jan 05 14:32:25 crc kubenswrapper[4740]: I0105 14:32:25.691671 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="a74ca1f5-6afe-4622-9306-f779d6f7dda3" 
containerName="collect-profiles" Jan 05 14:32:25 crc kubenswrapper[4740]: I0105 14:32:25.697694 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wpl6k" Jan 05 14:32:25 crc kubenswrapper[4740]: I0105 14:32:25.729812 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wpl6k"] Jan 05 14:32:25 crc kubenswrapper[4740]: I0105 14:32:25.815213 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce45fa02-824a-4806-8514-6b5eb9ac4627-catalog-content\") pod \"certified-operators-wpl6k\" (UID: \"ce45fa02-824a-4806-8514-6b5eb9ac4627\") " pod="openshift-marketplace/certified-operators-wpl6k" Jan 05 14:32:25 crc kubenswrapper[4740]: I0105 14:32:25.815404 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce45fa02-824a-4806-8514-6b5eb9ac4627-utilities\") pod \"certified-operators-wpl6k\" (UID: \"ce45fa02-824a-4806-8514-6b5eb9ac4627\") " pod="openshift-marketplace/certified-operators-wpl6k" Jan 05 14:32:25 crc kubenswrapper[4740]: I0105 14:32:25.815436 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqjz7\" (UniqueName: \"kubernetes.io/projected/ce45fa02-824a-4806-8514-6b5eb9ac4627-kube-api-access-qqjz7\") pod \"certified-operators-wpl6k\" (UID: \"ce45fa02-824a-4806-8514-6b5eb9ac4627\") " pod="openshift-marketplace/certified-operators-wpl6k" Jan 05 14:32:25 crc kubenswrapper[4740]: I0105 14:32:25.918204 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce45fa02-824a-4806-8514-6b5eb9ac4627-catalog-content\") pod \"certified-operators-wpl6k\" (UID: \"ce45fa02-824a-4806-8514-6b5eb9ac4627\") " pod="openshift-marketplace/certified-operators-wpl6k" Jan 05 14:32:25 crc kubenswrapper[4740]: I0105 14:32:25.918485 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce45fa02-824a-4806-8514-6b5eb9ac4627-utilities\") pod \"certified-operators-wpl6k\" (UID: \"ce45fa02-824a-4806-8514-6b5eb9ac4627\") " pod="openshift-marketplace/certified-operators-wpl6k" Jan 05 14:32:25 crc kubenswrapper[4740]: I0105 14:32:25.918528 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqjz7\" (UniqueName: \"kubernetes.io/projected/ce45fa02-824a-4806-8514-6b5eb9ac4627-kube-api-access-qqjz7\") pod \"certified-operators-wpl6k\" (UID: \"ce45fa02-824a-4806-8514-6b5eb9ac4627\") " pod="openshift-marketplace/certified-operators-wpl6k" Jan 05 14:32:25 crc kubenswrapper[4740]: I0105 14:32:25.918773 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce45fa02-824a-4806-8514-6b5eb9ac4627-catalog-content\") pod \"certified-operators-wpl6k\" (UID: \"ce45fa02-824a-4806-8514-6b5eb9ac4627\") " pod="openshift-marketplace/certified-operators-wpl6k" Jan 05 14:32:25 crc kubenswrapper[4740]: I0105 14:32:25.918901 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce45fa02-824a-4806-8514-6b5eb9ac4627-utilities\") pod \"certified-operators-wpl6k\" (UID: \"ce45fa02-824a-4806-8514-6b5eb9ac4627\") " 
pod="openshift-marketplace/certified-operators-wpl6k" Jan 05 14:32:25 crc kubenswrapper[4740]: I0105 14:32:25.951223 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqjz7\" (UniqueName: \"kubernetes.io/projected/ce45fa02-824a-4806-8514-6b5eb9ac4627-kube-api-access-qqjz7\") pod \"certified-operators-wpl6k\" (UID: \"ce45fa02-824a-4806-8514-6b5eb9ac4627\") " pod="openshift-marketplace/certified-operators-wpl6k" Jan 05 14:32:26 crc kubenswrapper[4740]: I0105 14:32:26.023153 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wpl6k" Jan 05 14:32:26 crc kubenswrapper[4740]: I0105 14:32:26.647997 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wpl6k"] Jan 05 14:32:27 crc kubenswrapper[4740]: I0105 14:32:27.017077 4740 generic.go:334] "Generic (PLEG): container finished" podID="ce45fa02-824a-4806-8514-6b5eb9ac4627" containerID="6777223f3abc8be182b453661d85f3332ed1b3875b46e4b2eca126a5e651752f" exitCode=0 Jan 05 14:32:27 crc kubenswrapper[4740]: I0105 14:32:27.017168 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wpl6k" event={"ID":"ce45fa02-824a-4806-8514-6b5eb9ac4627","Type":"ContainerDied","Data":"6777223f3abc8be182b453661d85f3332ed1b3875b46e4b2eca126a5e651752f"} Jan 05 14:32:27 crc kubenswrapper[4740]: I0105 14:32:27.019333 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wpl6k" event={"ID":"ce45fa02-824a-4806-8514-6b5eb9ac4627","Type":"ContainerStarted","Data":"97375d0708178b509f79e1526266caff28512aee2ed3e9dcea43c792bdc7201f"} Jan 05 14:32:28 crc kubenswrapper[4740]: I0105 14:32:28.029414 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wpl6k" event={"ID":"ce45fa02-824a-4806-8514-6b5eb9ac4627","Type":"ContainerStarted","Data":"fd4fbd74dd2f0ca6eff681c1f6f5800d8b46cfb1e64e1cd0b3952721eec200c1"} Jan 05 14:32:29 crc kubenswrapper[4740]: I0105 14:32:29.042049 4740 generic.go:334] "Generic (PLEG): container finished" podID="ce45fa02-824a-4806-8514-6b5eb9ac4627" containerID="fd4fbd74dd2f0ca6eff681c1f6f5800d8b46cfb1e64e1cd0b3952721eec200c1" exitCode=0 Jan 05 14:32:29 crc kubenswrapper[4740]: I0105 14:32:29.042128 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wpl6k" event={"ID":"ce45fa02-824a-4806-8514-6b5eb9ac4627","Type":"ContainerDied","Data":"fd4fbd74dd2f0ca6eff681c1f6f5800d8b46cfb1e64e1cd0b3952721eec200c1"} Jan 05 14:32:31 crc kubenswrapper[4740]: I0105 14:32:31.077282 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wpl6k" event={"ID":"ce45fa02-824a-4806-8514-6b5eb9ac4627","Type":"ContainerStarted","Data":"dec3b34fbfb9e9622c0f0c428b233e75a5c52ea512691d195d1ddf797a578218"} Jan 05 14:32:31 crc kubenswrapper[4740]: I0105 14:32:31.111399 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wpl6k" podStartSLOduration=3.087304349 podStartE2EDuration="6.111377212s" podCreationTimestamp="2026-01-05 14:32:25 +0000 UTC" firstStartedPulling="2026-01-05 14:32:27.019529913 +0000 UTC m=+2596.326438492" lastFinishedPulling="2026-01-05 14:32:30.043602746 +0000 UTC m=+2599.350511355" observedRunningTime="2026-01-05 14:32:31.102637355 +0000 UTC m=+2600.409545934" watchObservedRunningTime="2026-01-05 14:32:31.111377212 
+0000 UTC m=+2600.418285801" Jan 05 14:32:32 crc kubenswrapper[4740]: I0105 14:32:32.969643 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:32:32 crc kubenswrapper[4740]: E0105 14:32:32.970651 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:32:35 crc kubenswrapper[4740]: I0105 14:32:35.121652 4740 generic.go:334] "Generic (PLEG): container finished" podID="7fd8586e-a676-4220-a8b8-1b75b6d9a789" containerID="b2fc6f1dffc0ad65d7e4367fa44b1a093cd720d17e951e58460b63f1298eb814" exitCode=0 Jan 05 14:32:35 crc kubenswrapper[4740]: I0105 14:32:35.121871 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" event={"ID":"7fd8586e-a676-4220-a8b8-1b75b6d9a789","Type":"ContainerDied","Data":"b2fc6f1dffc0ad65d7e4367fa44b1a093cd720d17e951e58460b63f1298eb814"} Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.023603 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wpl6k" Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.024032 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wpl6k" Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.117370 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wpl6k" Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.220911 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wpl6k" Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.387415 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wpl6k"] Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.756141 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.810878 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-libvirt-secret-0\") pod \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.810962 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-inventory\") pod \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.811051 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-libvirt-combined-ca-bundle\") pod \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.811151 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggmnq\" (UniqueName: \"kubernetes.io/projected/7fd8586e-a676-4220-a8b8-1b75b6d9a789-kube-api-access-ggmnq\") pod \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.811232 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-ssh-key\") pod \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\" (UID: \"7fd8586e-a676-4220-a8b8-1b75b6d9a789\") " Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.818148 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fd8586e-a676-4220-a8b8-1b75b6d9a789-kube-api-access-ggmnq" (OuterVolumeSpecName: "kube-api-access-ggmnq") pod "7fd8586e-a676-4220-a8b8-1b75b6d9a789" (UID: "7fd8586e-a676-4220-a8b8-1b75b6d9a789"). InnerVolumeSpecName "kube-api-access-ggmnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.818575 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7fd8586e-a676-4220-a8b8-1b75b6d9a789" (UID: "7fd8586e-a676-4220-a8b8-1b75b6d9a789"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.844695 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "7fd8586e-a676-4220-a8b8-1b75b6d9a789" (UID: "7fd8586e-a676-4220-a8b8-1b75b6d9a789"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.852666 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-inventory" (OuterVolumeSpecName: "inventory") pod "7fd8586e-a676-4220-a8b8-1b75b6d9a789" (UID: "7fd8586e-a676-4220-a8b8-1b75b6d9a789"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.874214 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7fd8586e-a676-4220-a8b8-1b75b6d9a789" (UID: "7fd8586e-a676-4220-a8b8-1b75b6d9a789"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.912718 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.912744 4740 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.912755 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.912764 4740 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd8586e-a676-4220-a8b8-1b75b6d9a789-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:32:36 crc kubenswrapper[4740]: I0105 14:32:36.912774 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggmnq\" (UniqueName: \"kubernetes.io/projected/7fd8586e-a676-4220-a8b8-1b75b6d9a789-kube-api-access-ggmnq\") on node \"crc\" DevicePath \"\"" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.152036 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" event={"ID":"7fd8586e-a676-4220-a8b8-1b75b6d9a789","Type":"ContainerDied","Data":"b331222613c2c65e510aa6ce8da9909c74c4240fc459da9c82bdb581ec5ec840"} Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.152137 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b331222613c2c65e510aa6ce8da9909c74c4240fc459da9c82bdb581ec5ec840" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.152088 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.274945 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb"] Jan 05 14:32:37 crc kubenswrapper[4740]: E0105 14:32:37.275386 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fd8586e-a676-4220-a8b8-1b75b6d9a789" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.275399 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd8586e-a676-4220-a8b8-1b75b6d9a789" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.275654 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fd8586e-a676-4220-a8b8-1b75b6d9a789" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.276410 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.290787 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.292145 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.295622 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.295811 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.296287 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.296442 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.296687 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.299038 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb"] Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.323898 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r87dz\" (UniqueName: \"kubernetes.io/projected/0a84d8e6-06d0-4645-b42f-a77963c58987-kube-api-access-r87dz\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.324257 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.324409 4740 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.324544 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.324707 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.324823 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.325015 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.325263 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.325653 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.427894 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 
crc kubenswrapper[4740]: I0105 14:32:37.428284 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.428401 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.428546 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.428686 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.428937 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.429036 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r87dz\" (UniqueName: \"kubernetes.io/projected/0a84d8e6-06d0-4645-b42f-a77963c58987-kube-api-access-r87dz\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.429242 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.429386 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.430598 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.434305 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.434538 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.435121 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.435341 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.436374 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.438169 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.444515 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.451421 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r87dz\" (UniqueName: 
\"kubernetes.io/projected/0a84d8e6-06d0-4645-b42f-a77963c58987-kube-api-access-r87dz\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xg2sb\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:37 crc kubenswrapper[4740]: I0105 14:32:37.599640 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:32:38 crc kubenswrapper[4740]: I0105 14:32:38.164886 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wpl6k" podUID="ce45fa02-824a-4806-8514-6b5eb9ac4627" containerName="registry-server" containerID="cri-o://dec3b34fbfb9e9622c0f0c428b233e75a5c52ea512691d195d1ddf797a578218" gracePeriod=2 Jan 05 14:32:38 crc kubenswrapper[4740]: I0105 14:32:38.805774 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wpl6k" Jan 05 14:32:38 crc kubenswrapper[4740]: I0105 14:32:38.861569 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqjz7\" (UniqueName: \"kubernetes.io/projected/ce45fa02-824a-4806-8514-6b5eb9ac4627-kube-api-access-qqjz7\") pod \"ce45fa02-824a-4806-8514-6b5eb9ac4627\" (UID: \"ce45fa02-824a-4806-8514-6b5eb9ac4627\") " Jan 05 14:32:38 crc kubenswrapper[4740]: I0105 14:32:38.861642 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce45fa02-824a-4806-8514-6b5eb9ac4627-catalog-content\") pod \"ce45fa02-824a-4806-8514-6b5eb9ac4627\" (UID: \"ce45fa02-824a-4806-8514-6b5eb9ac4627\") " Jan 05 14:32:38 crc kubenswrapper[4740]: I0105 14:32:38.861698 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce45fa02-824a-4806-8514-6b5eb9ac4627-utilities\") pod \"ce45fa02-824a-4806-8514-6b5eb9ac4627\" (UID: \"ce45fa02-824a-4806-8514-6b5eb9ac4627\") " Jan 05 14:32:38 crc kubenswrapper[4740]: I0105 14:32:38.863244 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce45fa02-824a-4806-8514-6b5eb9ac4627-utilities" (OuterVolumeSpecName: "utilities") pod "ce45fa02-824a-4806-8514-6b5eb9ac4627" (UID: "ce45fa02-824a-4806-8514-6b5eb9ac4627"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:32:38 crc kubenswrapper[4740]: I0105 14:32:38.889734 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce45fa02-824a-4806-8514-6b5eb9ac4627-kube-api-access-qqjz7" (OuterVolumeSpecName: "kube-api-access-qqjz7") pod "ce45fa02-824a-4806-8514-6b5eb9ac4627" (UID: "ce45fa02-824a-4806-8514-6b5eb9ac4627"). InnerVolumeSpecName "kube-api-access-qqjz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:32:38 crc kubenswrapper[4740]: I0105 14:32:38.940771 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce45fa02-824a-4806-8514-6b5eb9ac4627-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce45fa02-824a-4806-8514-6b5eb9ac4627" (UID: "ce45fa02-824a-4806-8514-6b5eb9ac4627"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:32:38 crc kubenswrapper[4740]: I0105 14:32:38.964635 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqjz7\" (UniqueName: \"kubernetes.io/projected/ce45fa02-824a-4806-8514-6b5eb9ac4627-kube-api-access-qqjz7\") on node \"crc\" DevicePath \"\"" Jan 05 14:32:38 crc kubenswrapper[4740]: I0105 14:32:38.964664 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce45fa02-824a-4806-8514-6b5eb9ac4627-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:32:38 crc kubenswrapper[4740]: I0105 14:32:38.964673 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce45fa02-824a-4806-8514-6b5eb9ac4627-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:32:39 crc kubenswrapper[4740]: I0105 14:32:39.179265 4740 generic.go:334] "Generic (PLEG): container finished" podID="ce45fa02-824a-4806-8514-6b5eb9ac4627" containerID="dec3b34fbfb9e9622c0f0c428b233e75a5c52ea512691d195d1ddf797a578218" exitCode=0 Jan 05 14:32:39 crc kubenswrapper[4740]: I0105 14:32:39.179300 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wpl6k" event={"ID":"ce45fa02-824a-4806-8514-6b5eb9ac4627","Type":"ContainerDied","Data":"dec3b34fbfb9e9622c0f0c428b233e75a5c52ea512691d195d1ddf797a578218"} Jan 05 14:32:39 crc kubenswrapper[4740]: I0105 14:32:39.179324 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wpl6k" event={"ID":"ce45fa02-824a-4806-8514-6b5eb9ac4627","Type":"ContainerDied","Data":"97375d0708178b509f79e1526266caff28512aee2ed3e9dcea43c792bdc7201f"} Jan 05 14:32:39 crc kubenswrapper[4740]: I0105 14:32:39.179339 4740 scope.go:117] "RemoveContainer" containerID="dec3b34fbfb9e9622c0f0c428b233e75a5c52ea512691d195d1ddf797a578218" Jan 05 14:32:39 crc kubenswrapper[4740]: I0105 14:32:39.179381 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wpl6k" Jan 05 14:32:39 crc kubenswrapper[4740]: I0105 14:32:39.224024 4740 scope.go:117] "RemoveContainer" containerID="fd4fbd74dd2f0ca6eff681c1f6f5800d8b46cfb1e64e1cd0b3952721eec200c1" Jan 05 14:32:39 crc kubenswrapper[4740]: I0105 14:32:39.224702 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wpl6k"] Jan 05 14:32:39 crc kubenswrapper[4740]: I0105 14:32:39.242541 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wpl6k"] Jan 05 14:32:39 crc kubenswrapper[4740]: I0105 14:32:39.259337 4740 scope.go:117] "RemoveContainer" containerID="6777223f3abc8be182b453661d85f3332ed1b3875b46e4b2eca126a5e651752f" Jan 05 14:32:39 crc kubenswrapper[4740]: I0105 14:32:39.325057 4740 scope.go:117] "RemoveContainer" containerID="dec3b34fbfb9e9622c0f0c428b233e75a5c52ea512691d195d1ddf797a578218" Jan 05 14:32:39 crc kubenswrapper[4740]: E0105 14:32:39.326540 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dec3b34fbfb9e9622c0f0c428b233e75a5c52ea512691d195d1ddf797a578218\": container with ID starting with dec3b34fbfb9e9622c0f0c428b233e75a5c52ea512691d195d1ddf797a578218 not found: ID does not exist" containerID="dec3b34fbfb9e9622c0f0c428b233e75a5c52ea512691d195d1ddf797a578218" Jan 05 14:32:39 crc kubenswrapper[4740]: I0105 14:32:39.326578 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dec3b34fbfb9e9622c0f0c428b233e75a5c52ea512691d195d1ddf797a578218"} err="failed to get container status \"dec3b34fbfb9e9622c0f0c428b233e75a5c52ea512691d195d1ddf797a578218\": rpc error: code = NotFound desc = could not find container \"dec3b34fbfb9e9622c0f0c428b233e75a5c52ea512691d195d1ddf797a578218\": container with ID starting with dec3b34fbfb9e9622c0f0c428b233e75a5c52ea512691d195d1ddf797a578218 not found: ID does not exist" Jan 05 14:32:39 crc kubenswrapper[4740]: I0105 14:32:39.326600 4740 scope.go:117] "RemoveContainer" containerID="fd4fbd74dd2f0ca6eff681c1f6f5800d8b46cfb1e64e1cd0b3952721eec200c1" Jan 05 14:32:39 crc kubenswrapper[4740]: E0105 14:32:39.327004 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd4fbd74dd2f0ca6eff681c1f6f5800d8b46cfb1e64e1cd0b3952721eec200c1\": container with ID starting with fd4fbd74dd2f0ca6eff681c1f6f5800d8b46cfb1e64e1cd0b3952721eec200c1 not found: ID does not exist" containerID="fd4fbd74dd2f0ca6eff681c1f6f5800d8b46cfb1e64e1cd0b3952721eec200c1" Jan 05 14:32:39 crc kubenswrapper[4740]: I0105 14:32:39.327059 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd4fbd74dd2f0ca6eff681c1f6f5800d8b46cfb1e64e1cd0b3952721eec200c1"} err="failed to get container status \"fd4fbd74dd2f0ca6eff681c1f6f5800d8b46cfb1e64e1cd0b3952721eec200c1\": rpc error: code = NotFound desc = could not find container \"fd4fbd74dd2f0ca6eff681c1f6f5800d8b46cfb1e64e1cd0b3952721eec200c1\": container with ID starting with fd4fbd74dd2f0ca6eff681c1f6f5800d8b46cfb1e64e1cd0b3952721eec200c1 not found: ID does not exist" Jan 05 14:32:39 crc kubenswrapper[4740]: I0105 14:32:39.327173 4740 scope.go:117] "RemoveContainer" containerID="6777223f3abc8be182b453661d85f3332ed1b3875b46e4b2eca126a5e651752f" Jan 05 14:32:39 crc kubenswrapper[4740]: E0105 14:32:39.327520 4740 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6777223f3abc8be182b453661d85f3332ed1b3875b46e4b2eca126a5e651752f\": container with ID starting with 6777223f3abc8be182b453661d85f3332ed1b3875b46e4b2eca126a5e651752f not found: ID does not exist" containerID="6777223f3abc8be182b453661d85f3332ed1b3875b46e4b2eca126a5e651752f" Jan 05 14:32:39 crc kubenswrapper[4740]: I0105 14:32:39.327551 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6777223f3abc8be182b453661d85f3332ed1b3875b46e4b2eca126a5e651752f"} err="failed to get container status \"6777223f3abc8be182b453661d85f3332ed1b3875b46e4b2eca126a5e651752f\": rpc error: code = NotFound desc = could not find container \"6777223f3abc8be182b453661d85f3332ed1b3875b46e4b2eca126a5e651752f\": container with ID starting with 6777223f3abc8be182b453661d85f3332ed1b3875b46e4b2eca126a5e651752f not found: ID does not exist" Jan 05 14:32:39 crc kubenswrapper[4740]: I0105 14:32:39.873877 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb"] Jan 05 14:32:39 crc kubenswrapper[4740]: W0105 14:32:39.883391 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a84d8e6_06d0_4645_b42f_a77963c58987.slice/crio-3e002026d58c49ee411412cd47b388076677a3f979c10f8707b8a60282b4055f WatchSource:0}: Error finding container 3e002026d58c49ee411412cd47b388076677a3f979c10f8707b8a60282b4055f: Status 404 returned error can't find the container with id 3e002026d58c49ee411412cd47b388076677a3f979c10f8707b8a60282b4055f Jan 05 14:32:40 crc kubenswrapper[4740]: I0105 14:32:40.192586 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" event={"ID":"0a84d8e6-06d0-4645-b42f-a77963c58987","Type":"ContainerStarted","Data":"3e002026d58c49ee411412cd47b388076677a3f979c10f8707b8a60282b4055f"} Jan 05 14:32:40 crc kubenswrapper[4740]: I0105 14:32:40.993449 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce45fa02-824a-4806-8514-6b5eb9ac4627" path="/var/lib/kubelet/pods/ce45fa02-824a-4806-8514-6b5eb9ac4627/volumes" Jan 05 14:32:41 crc kubenswrapper[4740]: I0105 14:32:41.209145 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" event={"ID":"0a84d8e6-06d0-4645-b42f-a77963c58987","Type":"ContainerStarted","Data":"b1aac9321632dcb0da6836d6990acde216eb33ea59eae118acb90c3124783116"} Jan 05 14:32:41 crc kubenswrapper[4740]: I0105 14:32:41.229421 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" podStartSLOduration=3.745873584 podStartE2EDuration="4.229406214s" podCreationTimestamp="2026-01-05 14:32:37 +0000 UTC" firstStartedPulling="2026-01-05 14:32:39.885709172 +0000 UTC m=+2609.192617751" lastFinishedPulling="2026-01-05 14:32:40.369241792 +0000 UTC m=+2609.676150381" observedRunningTime="2026-01-05 14:32:41.228021597 +0000 UTC m=+2610.534930206" watchObservedRunningTime="2026-01-05 14:32:41.229406214 +0000 UTC m=+2610.536314783" Jan 05 14:32:43 crc kubenswrapper[4740]: I0105 14:32:43.968036 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:32:43 crc kubenswrapper[4740]: E0105 14:32:43.969008 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:32:55 crc kubenswrapper[4740]: I0105 14:32:55.968769 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:32:55 crc kubenswrapper[4740]: E0105 14:32:55.969525 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:33:08 crc kubenswrapper[4740]: I0105 14:33:08.969424 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:33:08 crc kubenswrapper[4740]: E0105 14:33:08.970282 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:33:22 crc kubenswrapper[4740]: I0105 14:33:22.968972 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:33:22 crc kubenswrapper[4740]: E0105 14:33:22.970055 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:33:33 crc kubenswrapper[4740]: I0105 14:33:33.968345 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:33:33 crc kubenswrapper[4740]: E0105 14:33:33.969237 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:33:47 crc kubenswrapper[4740]: I0105 14:33:47.968910 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:33:47 crc kubenswrapper[4740]: E0105 14:33:47.969661 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:33:59 crc kubenswrapper[4740]: I0105 14:33:59.969897 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:33:59 crc kubenswrapper[4740]: E0105 14:33:59.972554 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:34:10 crc kubenswrapper[4740]: I0105 14:34:10.985303 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:34:10 crc kubenswrapper[4740]: E0105 14:34:10.986125 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:34:24 crc kubenswrapper[4740]: I0105 14:34:24.969535 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:34:24 crc kubenswrapper[4740]: E0105 14:34:24.970616 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:34:38 crc kubenswrapper[4740]: I0105 14:34:38.968388 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:34:39 crc kubenswrapper[4740]: I0105 14:34:39.741351 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"97adcb1f23098a3ed5375417ea1f2f00fd3cab92f0c37ade5cce5e918ccb2aef"} Jan 05 14:35:19 crc kubenswrapper[4740]: I0105 14:35:19.252866 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-249zz"] Jan 05 14:35:19 crc kubenswrapper[4740]: E0105 14:35:19.253801 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce45fa02-824a-4806-8514-6b5eb9ac4627" containerName="extract-utilities" Jan 05 14:35:19 crc kubenswrapper[4740]: I0105 14:35:19.253814 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce45fa02-824a-4806-8514-6b5eb9ac4627" containerName="extract-utilities" Jan 05 14:35:19 crc kubenswrapper[4740]: E0105 14:35:19.253827 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce45fa02-824a-4806-8514-6b5eb9ac4627" containerName="registry-server" Jan 05 14:35:19 crc 
kubenswrapper[4740]: I0105 14:35:19.253833 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce45fa02-824a-4806-8514-6b5eb9ac4627" containerName="registry-server" Jan 05 14:35:19 crc kubenswrapper[4740]: E0105 14:35:19.253862 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce45fa02-824a-4806-8514-6b5eb9ac4627" containerName="extract-content" Jan 05 14:35:19 crc kubenswrapper[4740]: I0105 14:35:19.253868 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce45fa02-824a-4806-8514-6b5eb9ac4627" containerName="extract-content" Jan 05 14:35:19 crc kubenswrapper[4740]: I0105 14:35:19.254113 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce45fa02-824a-4806-8514-6b5eb9ac4627" containerName="registry-server" Jan 05 14:35:19 crc kubenswrapper[4740]: I0105 14:35:19.255782 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-249zz" Jan 05 14:35:19 crc kubenswrapper[4740]: I0105 14:35:19.266893 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-249zz"] Jan 05 14:35:19 crc kubenswrapper[4740]: I0105 14:35:19.359618 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0abbb64c-21db-46dc-94d2-618e3bdff03f-catalog-content\") pod \"community-operators-249zz\" (UID: \"0abbb64c-21db-46dc-94d2-618e3bdff03f\") " pod="openshift-marketplace/community-operators-249zz" Jan 05 14:35:19 crc kubenswrapper[4740]: I0105 14:35:19.359713 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0abbb64c-21db-46dc-94d2-618e3bdff03f-utilities\") pod \"community-operators-249zz\" (UID: \"0abbb64c-21db-46dc-94d2-618e3bdff03f\") " pod="openshift-marketplace/community-operators-249zz" Jan 05 14:35:19 crc kubenswrapper[4740]: I0105 14:35:19.359767 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mngz6\" (UniqueName: \"kubernetes.io/projected/0abbb64c-21db-46dc-94d2-618e3bdff03f-kube-api-access-mngz6\") pod \"community-operators-249zz\" (UID: \"0abbb64c-21db-46dc-94d2-618e3bdff03f\") " pod="openshift-marketplace/community-operators-249zz" Jan 05 14:35:19 crc kubenswrapper[4740]: I0105 14:35:19.463139 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0abbb64c-21db-46dc-94d2-618e3bdff03f-catalog-content\") pod \"community-operators-249zz\" (UID: \"0abbb64c-21db-46dc-94d2-618e3bdff03f\") " pod="openshift-marketplace/community-operators-249zz" Jan 05 14:35:19 crc kubenswrapper[4740]: I0105 14:35:19.463468 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0abbb64c-21db-46dc-94d2-618e3bdff03f-utilities\") pod \"community-operators-249zz\" (UID: \"0abbb64c-21db-46dc-94d2-618e3bdff03f\") " pod="openshift-marketplace/community-operators-249zz" Jan 05 14:35:19 crc kubenswrapper[4740]: I0105 14:35:19.463523 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mngz6\" (UniqueName: \"kubernetes.io/projected/0abbb64c-21db-46dc-94d2-618e3bdff03f-kube-api-access-mngz6\") pod \"community-operators-249zz\" (UID: \"0abbb64c-21db-46dc-94d2-618e3bdff03f\") " 
pod="openshift-marketplace/community-operators-249zz" Jan 05 14:35:19 crc kubenswrapper[4740]: I0105 14:35:19.463624 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0abbb64c-21db-46dc-94d2-618e3bdff03f-catalog-content\") pod \"community-operators-249zz\" (UID: \"0abbb64c-21db-46dc-94d2-618e3bdff03f\") " pod="openshift-marketplace/community-operators-249zz" Jan 05 14:35:19 crc kubenswrapper[4740]: I0105 14:35:19.463774 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0abbb64c-21db-46dc-94d2-618e3bdff03f-utilities\") pod \"community-operators-249zz\" (UID: \"0abbb64c-21db-46dc-94d2-618e3bdff03f\") " pod="openshift-marketplace/community-operators-249zz" Jan 05 14:35:19 crc kubenswrapper[4740]: I0105 14:35:19.485342 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mngz6\" (UniqueName: \"kubernetes.io/projected/0abbb64c-21db-46dc-94d2-618e3bdff03f-kube-api-access-mngz6\") pod \"community-operators-249zz\" (UID: \"0abbb64c-21db-46dc-94d2-618e3bdff03f\") " pod="openshift-marketplace/community-operators-249zz" Jan 05 14:35:19 crc kubenswrapper[4740]: I0105 14:35:19.583598 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-249zz" Jan 05 14:35:20 crc kubenswrapper[4740]: I0105 14:35:20.098424 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-249zz"] Jan 05 14:35:20 crc kubenswrapper[4740]: I0105 14:35:20.282904 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-249zz" event={"ID":"0abbb64c-21db-46dc-94d2-618e3bdff03f","Type":"ContainerStarted","Data":"6d1a98bfdecf88192848917e91cc0132cecccb6f53009bd26109ad81e3f84b0a"} Jan 05 14:35:21 crc kubenswrapper[4740]: I0105 14:35:21.298316 4740 generic.go:334] "Generic (PLEG): container finished" podID="0abbb64c-21db-46dc-94d2-618e3bdff03f" containerID="c498bba0faa3114f6802d2ae9286098e9127ddb4d5d1e4828bbfe70e4bdefcb9" exitCode=0 Jan 05 14:35:21 crc kubenswrapper[4740]: I0105 14:35:21.298432 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-249zz" event={"ID":"0abbb64c-21db-46dc-94d2-618e3bdff03f","Type":"ContainerDied","Data":"c498bba0faa3114f6802d2ae9286098e9127ddb4d5d1e4828bbfe70e4bdefcb9"} Jan 05 14:35:21 crc kubenswrapper[4740]: I0105 14:35:21.302298 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 05 14:35:23 crc kubenswrapper[4740]: I0105 14:35:23.332809 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-249zz" event={"ID":"0abbb64c-21db-46dc-94d2-618e3bdff03f","Type":"ContainerStarted","Data":"062c1bf2cae8cf3efa05b0cbd50fcdebb65d9d47262ec984ed68a856daf80be0"} Jan 05 14:35:23 crc kubenswrapper[4740]: I0105 14:35:23.615134 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5frhx"] Jan 05 14:35:23 crc kubenswrapper[4740]: I0105 14:35:23.624970 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5frhx" Jan 05 14:35:23 crc kubenswrapper[4740]: I0105 14:35:23.651276 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5frhx"] Jan 05 14:35:23 crc kubenswrapper[4740]: I0105 14:35:23.790716 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-utilities\") pod \"redhat-marketplace-5frhx\" (UID: \"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3\") " pod="openshift-marketplace/redhat-marketplace-5frhx" Jan 05 14:35:23 crc kubenswrapper[4740]: I0105 14:35:23.790868 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-catalog-content\") pod \"redhat-marketplace-5frhx\" (UID: \"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3\") " pod="openshift-marketplace/redhat-marketplace-5frhx" Jan 05 14:35:23 crc kubenswrapper[4740]: I0105 14:35:23.790951 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tthk5\" (UniqueName: \"kubernetes.io/projected/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-kube-api-access-tthk5\") pod \"redhat-marketplace-5frhx\" (UID: \"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3\") " pod="openshift-marketplace/redhat-marketplace-5frhx" Jan 05 14:35:23 crc kubenswrapper[4740]: I0105 14:35:23.893027 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-utilities\") pod \"redhat-marketplace-5frhx\" (UID: \"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3\") " pod="openshift-marketplace/redhat-marketplace-5frhx" Jan 05 14:35:23 crc kubenswrapper[4740]: I0105 14:35:23.893228 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-catalog-content\") pod \"redhat-marketplace-5frhx\" (UID: \"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3\") " pod="openshift-marketplace/redhat-marketplace-5frhx" Jan 05 14:35:23 crc kubenswrapper[4740]: I0105 14:35:23.893334 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tthk5\" (UniqueName: \"kubernetes.io/projected/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-kube-api-access-tthk5\") pod \"redhat-marketplace-5frhx\" (UID: \"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3\") " pod="openshift-marketplace/redhat-marketplace-5frhx" Jan 05 14:35:23 crc kubenswrapper[4740]: I0105 14:35:23.893719 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-utilities\") pod \"redhat-marketplace-5frhx\" (UID: \"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3\") " pod="openshift-marketplace/redhat-marketplace-5frhx" Jan 05 14:35:23 crc kubenswrapper[4740]: I0105 14:35:23.893815 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-catalog-content\") pod \"redhat-marketplace-5frhx\" (UID: \"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3\") " pod="openshift-marketplace/redhat-marketplace-5frhx" Jan 05 14:35:23 crc kubenswrapper[4740]: I0105 14:35:23.925820 4740 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-tthk5\" (UniqueName: \"kubernetes.io/projected/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-kube-api-access-tthk5\") pod \"redhat-marketplace-5frhx\" (UID: \"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3\") " pod="openshift-marketplace/redhat-marketplace-5frhx" Jan 05 14:35:23 crc kubenswrapper[4740]: I0105 14:35:23.963849 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5frhx" Jan 05 14:35:24 crc kubenswrapper[4740]: I0105 14:35:24.347322 4740 generic.go:334] "Generic (PLEG): container finished" podID="0abbb64c-21db-46dc-94d2-618e3bdff03f" containerID="062c1bf2cae8cf3efa05b0cbd50fcdebb65d9d47262ec984ed68a856daf80be0" exitCode=0 Jan 05 14:35:24 crc kubenswrapper[4740]: I0105 14:35:24.347536 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-249zz" event={"ID":"0abbb64c-21db-46dc-94d2-618e3bdff03f","Type":"ContainerDied","Data":"062c1bf2cae8cf3efa05b0cbd50fcdebb65d9d47262ec984ed68a856daf80be0"} Jan 05 14:35:24 crc kubenswrapper[4740]: I0105 14:35:24.575638 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5frhx"] Jan 05 14:35:24 crc kubenswrapper[4740]: W0105 14:35:24.577949 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0698bb2_ffbb_4f7c_a98d_1d11735ee4a3.slice/crio-dc7c5af059ffa8e7bde1cf0b609575bbad37b18e44e2a787f5e586e0b6b0ea31 WatchSource:0}: Error finding container dc7c5af059ffa8e7bde1cf0b609575bbad37b18e44e2a787f5e586e0b6b0ea31: Status 404 returned error can't find the container with id dc7c5af059ffa8e7bde1cf0b609575bbad37b18e44e2a787f5e586e0b6b0ea31 Jan 05 14:35:25 crc kubenswrapper[4740]: I0105 14:35:25.363133 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-249zz" event={"ID":"0abbb64c-21db-46dc-94d2-618e3bdff03f","Type":"ContainerStarted","Data":"c6b2b9a30794f8dfda4c6f5524ed0188b0059d3158698825561514f90282ed3d"} Jan 05 14:35:25 crc kubenswrapper[4740]: I0105 14:35:25.366563 4740 generic.go:334] "Generic (PLEG): container finished" podID="e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3" containerID="e2418cd4c1604ead481507d278d14a563713c48e3604b7d997b3df92e15db7da" exitCode=0 Jan 05 14:35:25 crc kubenswrapper[4740]: I0105 14:35:25.366601 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5frhx" event={"ID":"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3","Type":"ContainerDied","Data":"e2418cd4c1604ead481507d278d14a563713c48e3604b7d997b3df92e15db7da"} Jan 05 14:35:25 crc kubenswrapper[4740]: I0105 14:35:25.366622 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5frhx" event={"ID":"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3","Type":"ContainerStarted","Data":"dc7c5af059ffa8e7bde1cf0b609575bbad37b18e44e2a787f5e586e0b6b0ea31"} Jan 05 14:35:25 crc kubenswrapper[4740]: I0105 14:35:25.423594 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-249zz" podStartSLOduration=2.8287539219999998 podStartE2EDuration="6.423575914s" podCreationTimestamp="2026-01-05 14:35:19 +0000 UTC" firstStartedPulling="2026-01-05 14:35:21.301963626 +0000 UTC m=+2770.608872205" lastFinishedPulling="2026-01-05 14:35:24.896785618 +0000 UTC m=+2774.203694197" observedRunningTime="2026-01-05 14:35:25.407486768 +0000 UTC m=+2774.714395367" 
watchObservedRunningTime="2026-01-05 14:35:25.423575914 +0000 UTC m=+2774.730484493" Jan 05 14:35:26 crc kubenswrapper[4740]: I0105 14:35:26.381314 4740 generic.go:334] "Generic (PLEG): container finished" podID="0a84d8e6-06d0-4645-b42f-a77963c58987" containerID="b1aac9321632dcb0da6836d6990acde216eb33ea59eae118acb90c3124783116" exitCode=0 Jan 05 14:35:26 crc kubenswrapper[4740]: I0105 14:35:26.381365 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" event={"ID":"0a84d8e6-06d0-4645-b42f-a77963c58987","Type":"ContainerDied","Data":"b1aac9321632dcb0da6836d6990acde216eb33ea59eae118acb90c3124783116"} Jan 05 14:35:26 crc kubenswrapper[4740]: I0105 14:35:26.385277 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5frhx" event={"ID":"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3","Type":"ContainerStarted","Data":"c18c75504a4143bfce5a6b569c197d7c854ce89c48961147cf1571e741765258"} Jan 05 14:35:27 crc kubenswrapper[4740]: I0105 14:35:27.431502 4740 generic.go:334] "Generic (PLEG): container finished" podID="e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3" containerID="c18c75504a4143bfce5a6b569c197d7c854ce89c48961147cf1571e741765258" exitCode=0 Jan 05 14:35:27 crc kubenswrapper[4740]: I0105 14:35:27.432318 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5frhx" event={"ID":"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3","Type":"ContainerDied","Data":"c18c75504a4143bfce5a6b569c197d7c854ce89c48961147cf1571e741765258"} Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:27.999824 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.104587 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-cell1-compute-config-0\") pod \"0a84d8e6-06d0-4645-b42f-a77963c58987\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.104631 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-ssh-key\") pod \"0a84d8e6-06d0-4645-b42f-a77963c58987\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.104662 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r87dz\" (UniqueName: \"kubernetes.io/projected/0a84d8e6-06d0-4645-b42f-a77963c58987-kube-api-access-r87dz\") pod \"0a84d8e6-06d0-4645-b42f-a77963c58987\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.104799 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-inventory\") pod \"0a84d8e6-06d0-4645-b42f-a77963c58987\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.104866 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-migration-ssh-key-1\") pod \"0a84d8e6-06d0-4645-b42f-a77963c58987\" (UID: 
\"0a84d8e6-06d0-4645-b42f-a77963c58987\") " Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.104888 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-migration-ssh-key-0\") pod \"0a84d8e6-06d0-4645-b42f-a77963c58987\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.104918 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-cell1-compute-config-1\") pod \"0a84d8e6-06d0-4645-b42f-a77963c58987\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.105049 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-combined-ca-bundle\") pod \"0a84d8e6-06d0-4645-b42f-a77963c58987\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.105114 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-extra-config-0\") pod \"0a84d8e6-06d0-4645-b42f-a77963c58987\" (UID: \"0a84d8e6-06d0-4645-b42f-a77963c58987\") " Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.113956 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a84d8e6-06d0-4645-b42f-a77963c58987-kube-api-access-r87dz" (OuterVolumeSpecName: "kube-api-access-r87dz") pod "0a84d8e6-06d0-4645-b42f-a77963c58987" (UID: "0a84d8e6-06d0-4645-b42f-a77963c58987"). InnerVolumeSpecName "kube-api-access-r87dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.116267 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0a84d8e6-06d0-4645-b42f-a77963c58987" (UID: "0a84d8e6-06d0-4645-b42f-a77963c58987"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.184058 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "0a84d8e6-06d0-4645-b42f-a77963c58987" (UID: "0a84d8e6-06d0-4645-b42f-a77963c58987"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.184613 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "0a84d8e6-06d0-4645-b42f-a77963c58987" (UID: "0a84d8e6-06d0-4645-b42f-a77963c58987"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.188652 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "0a84d8e6-06d0-4645-b42f-a77963c58987" (UID: "0a84d8e6-06d0-4645-b42f-a77963c58987"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.189281 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0a84d8e6-06d0-4645-b42f-a77963c58987" (UID: "0a84d8e6-06d0-4645-b42f-a77963c58987"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.190660 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-inventory" (OuterVolumeSpecName: "inventory") pod "0a84d8e6-06d0-4645-b42f-a77963c58987" (UID: "0a84d8e6-06d0-4645-b42f-a77963c58987"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.204193 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "0a84d8e6-06d0-4645-b42f-a77963c58987" (UID: "0a84d8e6-06d0-4645-b42f-a77963c58987"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.207824 4740 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.207852 4740 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.207862 4740 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.207871 4740 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.207880 4740 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.207890 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:35:28 crc kubenswrapper[4740]: 
I0105 14:35:28.207901 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r87dz\" (UniqueName: \"kubernetes.io/projected/0a84d8e6-06d0-4645-b42f-a77963c58987-kube-api-access-r87dz\") on node \"crc\" DevicePath \"\"" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.207929 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.210711 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "0a84d8e6-06d0-4645-b42f-a77963c58987" (UID: "0a84d8e6-06d0-4645-b42f-a77963c58987"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.310182 4740 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0a84d8e6-06d0-4645-b42f-a77963c58987-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.447842 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" event={"ID":"0a84d8e6-06d0-4645-b42f-a77963c58987","Type":"ContainerDied","Data":"3e002026d58c49ee411412cd47b388076677a3f979c10f8707b8a60282b4055f"} Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.447889 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e002026d58c49ee411412cd47b388076677a3f979c10f8707b8a60282b4055f" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.447951 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xg2sb" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.544969 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv"] Jan 05 14:35:28 crc kubenswrapper[4740]: E0105 14:35:28.545880 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a84d8e6-06d0-4645-b42f-a77963c58987" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.545904 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a84d8e6-06d0-4645-b42f-a77963c58987" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.546201 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a84d8e6-06d0-4645-b42f-a77963c58987" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.547092 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.549276 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.549435 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.549921 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.549961 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.550708 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.559993 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv"] Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.721715 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vxqz\" (UniqueName: \"kubernetes.io/projected/f2e8ef4b-ba8e-46af-b20d-f19af317419c-kube-api-access-9vxqz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.721793 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.721900 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.721922 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.721947 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.722022 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.722061 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.824583 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.824641 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.824671 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.824725 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.824758 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.824859 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vxqz\" (UniqueName: \"kubernetes.io/projected/f2e8ef4b-ba8e-46af-b20d-f19af317419c-kube-api-access-9vxqz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: 
\"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.824885 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.829956 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.829973 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.830511 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.830564 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.830885 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.832000 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.843709 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vxqz\" (UniqueName: \"kubernetes.io/projected/f2e8ef4b-ba8e-46af-b20d-f19af317419c-kube-api-access-9vxqz\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:28 crc kubenswrapper[4740]: I0105 14:35:28.873040 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:35:29 crc kubenswrapper[4740]: I0105 14:35:29.457836 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv"] Jan 05 14:35:29 crc kubenswrapper[4740]: W0105 14:35:29.477751 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2e8ef4b_ba8e_46af_b20d_f19af317419c.slice/crio-463f72bd20d3c9886f59ee38adedabb2d7a8e13d4bbc6ec7a61da72ab2fbd3ee WatchSource:0}: Error finding container 463f72bd20d3c9886f59ee38adedabb2d7a8e13d4bbc6ec7a61da72ab2fbd3ee: Status 404 returned error can't find the container with id 463f72bd20d3c9886f59ee38adedabb2d7a8e13d4bbc6ec7a61da72ab2fbd3ee Jan 05 14:35:29 crc kubenswrapper[4740]: I0105 14:35:29.478402 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5frhx" event={"ID":"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3","Type":"ContainerStarted","Data":"9238b62a5edfda40f30fc9f4b254802f771c74e44704b5f6998381547bf19b48"} Jan 05 14:35:29 crc kubenswrapper[4740]: I0105 14:35:29.515342 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5frhx" podStartSLOduration=3.634843794 podStartE2EDuration="6.515323814s" podCreationTimestamp="2026-01-05 14:35:23 +0000 UTC" firstStartedPulling="2026-01-05 14:35:25.368390721 +0000 UTC m=+2774.675299320" lastFinishedPulling="2026-01-05 14:35:28.248870761 +0000 UTC m=+2777.555779340" observedRunningTime="2026-01-05 14:35:29.506707313 +0000 UTC m=+2778.813615892" watchObservedRunningTime="2026-01-05 14:35:29.515323814 +0000 UTC m=+2778.822232393" Jan 05 14:35:29 crc kubenswrapper[4740]: I0105 14:35:29.583664 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-249zz" Jan 05 14:35:29 crc kubenswrapper[4740]: I0105 14:35:29.584026 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-249zz" Jan 05 14:35:30 crc kubenswrapper[4740]: I0105 14:35:30.490391 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" event={"ID":"f2e8ef4b-ba8e-46af-b20d-f19af317419c","Type":"ContainerStarted","Data":"463f72bd20d3c9886f59ee38adedabb2d7a8e13d4bbc6ec7a61da72ab2fbd3ee"} Jan 05 14:35:30 crc kubenswrapper[4740]: I0105 14:35:30.627241 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-249zz" podUID="0abbb64c-21db-46dc-94d2-618e3bdff03f" containerName="registry-server" probeResult="failure" output=< Jan 05 14:35:30 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 14:35:30 crc kubenswrapper[4740]: > Jan 05 14:35:31 crc kubenswrapper[4740]: I0105 14:35:31.507919 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" event={"ID":"f2e8ef4b-ba8e-46af-b20d-f19af317419c","Type":"ContainerStarted","Data":"babff738f21c71f6744b95a10ffe4a4d486cc799e4bfb9feb43a88251a09a36f"} Jan 05 
14:35:31 crc kubenswrapper[4740]: I0105 14:35:31.534983 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" podStartSLOduration=2.347353619 podStartE2EDuration="3.534959368s" podCreationTimestamp="2026-01-05 14:35:28 +0000 UTC" firstStartedPulling="2026-01-05 14:35:29.482750341 +0000 UTC m=+2778.789658920" lastFinishedPulling="2026-01-05 14:35:30.67035608 +0000 UTC m=+2779.977264669" observedRunningTime="2026-01-05 14:35:31.534917237 +0000 UTC m=+2780.841825816" watchObservedRunningTime="2026-01-05 14:35:31.534959368 +0000 UTC m=+2780.841867987" Jan 05 14:35:33 crc kubenswrapper[4740]: I0105 14:35:33.964956 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5frhx" Jan 05 14:35:33 crc kubenswrapper[4740]: I0105 14:35:33.965979 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5frhx" Jan 05 14:35:34 crc kubenswrapper[4740]: I0105 14:35:34.066202 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5frhx" Jan 05 14:35:34 crc kubenswrapper[4740]: I0105 14:35:34.643155 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5frhx" Jan 05 14:35:34 crc kubenswrapper[4740]: I0105 14:35:34.713946 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5frhx"] Jan 05 14:35:36 crc kubenswrapper[4740]: I0105 14:35:36.594757 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5frhx" podUID="e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3" containerName="registry-server" containerID="cri-o://9238b62a5edfda40f30fc9f4b254802f771c74e44704b5f6998381547bf19b48" gracePeriod=2 Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.157957 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5frhx" Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.260572 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-utilities\") pod \"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3\" (UID: \"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3\") " Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.260745 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-catalog-content\") pod \"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3\" (UID: \"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3\") " Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.261406 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-utilities" (OuterVolumeSpecName: "utilities") pod "e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3" (UID: "e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.262648 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tthk5\" (UniqueName: \"kubernetes.io/projected/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-kube-api-access-tthk5\") pod \"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3\" (UID: \"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3\") " Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.266110 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.268596 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-kube-api-access-tthk5" (OuterVolumeSpecName: "kube-api-access-tthk5") pod "e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3" (UID: "e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3"). InnerVolumeSpecName "kube-api-access-tthk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.296932 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3" (UID: "e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.369265 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.369643 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tthk5\" (UniqueName: \"kubernetes.io/projected/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3-kube-api-access-tthk5\") on node \"crc\" DevicePath \"\"" Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.612703 4740 generic.go:334] "Generic (PLEG): container finished" podID="e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3" containerID="9238b62a5edfda40f30fc9f4b254802f771c74e44704b5f6998381547bf19b48" exitCode=0 Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.612746 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5frhx" event={"ID":"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3","Type":"ContainerDied","Data":"9238b62a5edfda40f30fc9f4b254802f771c74e44704b5f6998381547bf19b48"} Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.612795 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5frhx" event={"ID":"e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3","Type":"ContainerDied","Data":"dc7c5af059ffa8e7bde1cf0b609575bbad37b18e44e2a787f5e586e0b6b0ea31"} Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.612822 4740 scope.go:117] "RemoveContainer" containerID="9238b62a5edfda40f30fc9f4b254802f771c74e44704b5f6998381547bf19b48" Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.612820 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5frhx" Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.665939 4740 scope.go:117] "RemoveContainer" containerID="c18c75504a4143bfce5a6b569c197d7c854ce89c48961147cf1571e741765258" Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.689264 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5frhx"] Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.705413 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5frhx"] Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.735755 4740 scope.go:117] "RemoveContainer" containerID="e2418cd4c1604ead481507d278d14a563713c48e3604b7d997b3df92e15db7da" Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.781133 4740 scope.go:117] "RemoveContainer" containerID="9238b62a5edfda40f30fc9f4b254802f771c74e44704b5f6998381547bf19b48" Jan 05 14:35:37 crc kubenswrapper[4740]: E0105 14:35:37.781671 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9238b62a5edfda40f30fc9f4b254802f771c74e44704b5f6998381547bf19b48\": container with ID starting with 9238b62a5edfda40f30fc9f4b254802f771c74e44704b5f6998381547bf19b48 not found: ID does not exist" containerID="9238b62a5edfda40f30fc9f4b254802f771c74e44704b5f6998381547bf19b48" Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.781736 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9238b62a5edfda40f30fc9f4b254802f771c74e44704b5f6998381547bf19b48"} err="failed to get container status \"9238b62a5edfda40f30fc9f4b254802f771c74e44704b5f6998381547bf19b48\": rpc error: code = NotFound desc = could not find container \"9238b62a5edfda40f30fc9f4b254802f771c74e44704b5f6998381547bf19b48\": container with ID starting with 9238b62a5edfda40f30fc9f4b254802f771c74e44704b5f6998381547bf19b48 not found: ID does not exist" Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.781787 4740 scope.go:117] "RemoveContainer" containerID="c18c75504a4143bfce5a6b569c197d7c854ce89c48961147cf1571e741765258" Jan 05 14:35:37 crc kubenswrapper[4740]: E0105 14:35:37.782134 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c18c75504a4143bfce5a6b569c197d7c854ce89c48961147cf1571e741765258\": container with ID starting with c18c75504a4143bfce5a6b569c197d7c854ce89c48961147cf1571e741765258 not found: ID does not exist" containerID="c18c75504a4143bfce5a6b569c197d7c854ce89c48961147cf1571e741765258" Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.782163 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c18c75504a4143bfce5a6b569c197d7c854ce89c48961147cf1571e741765258"} err="failed to get container status \"c18c75504a4143bfce5a6b569c197d7c854ce89c48961147cf1571e741765258\": rpc error: code = NotFound desc = could not find container \"c18c75504a4143bfce5a6b569c197d7c854ce89c48961147cf1571e741765258\": container with ID starting with c18c75504a4143bfce5a6b569c197d7c854ce89c48961147cf1571e741765258 not found: ID does not exist" Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.782186 4740 scope.go:117] "RemoveContainer" containerID="e2418cd4c1604ead481507d278d14a563713c48e3604b7d997b3df92e15db7da" Jan 05 14:35:37 crc kubenswrapper[4740]: E0105 14:35:37.783319 4740 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e2418cd4c1604ead481507d278d14a563713c48e3604b7d997b3df92e15db7da\": container with ID starting with e2418cd4c1604ead481507d278d14a563713c48e3604b7d997b3df92e15db7da not found: ID does not exist" containerID="e2418cd4c1604ead481507d278d14a563713c48e3604b7d997b3df92e15db7da" Jan 05 14:35:37 crc kubenswrapper[4740]: I0105 14:35:37.783364 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2418cd4c1604ead481507d278d14a563713c48e3604b7d997b3df92e15db7da"} err="failed to get container status \"e2418cd4c1604ead481507d278d14a563713c48e3604b7d997b3df92e15db7da\": rpc error: code = NotFound desc = could not find container \"e2418cd4c1604ead481507d278d14a563713c48e3604b7d997b3df92e15db7da\": container with ID starting with e2418cd4c1604ead481507d278d14a563713c48e3604b7d997b3df92e15db7da not found: ID does not exist" Jan 05 14:35:38 crc kubenswrapper[4740]: I0105 14:35:38.987550 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3" path="/var/lib/kubelet/pods/e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3/volumes" Jan 05 14:35:39 crc kubenswrapper[4740]: I0105 14:35:39.651256 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-249zz" Jan 05 14:35:39 crc kubenswrapper[4740]: I0105 14:35:39.746956 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-249zz" Jan 05 14:35:40 crc kubenswrapper[4740]: I0105 14:35:40.743245 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-249zz"] Jan 05 14:35:41 crc kubenswrapper[4740]: I0105 14:35:41.674407 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-249zz" podUID="0abbb64c-21db-46dc-94d2-618e3bdff03f" containerName="registry-server" containerID="cri-o://c6b2b9a30794f8dfda4c6f5524ed0188b0059d3158698825561514f90282ed3d" gracePeriod=2 Jan 05 14:35:41 crc kubenswrapper[4740]: E0105 14:35:41.911001 4740 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0abbb64c_21db_46dc_94d2_618e3bdff03f.slice/crio-c6b2b9a30794f8dfda4c6f5524ed0188b0059d3158698825561514f90282ed3d.scope\": RecentStats: unable to find data in memory cache]" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.234535 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-249zz" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.325721 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mngz6\" (UniqueName: \"kubernetes.io/projected/0abbb64c-21db-46dc-94d2-618e3bdff03f-kube-api-access-mngz6\") pod \"0abbb64c-21db-46dc-94d2-618e3bdff03f\" (UID: \"0abbb64c-21db-46dc-94d2-618e3bdff03f\") " Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.325976 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0abbb64c-21db-46dc-94d2-618e3bdff03f-catalog-content\") pod \"0abbb64c-21db-46dc-94d2-618e3bdff03f\" (UID: \"0abbb64c-21db-46dc-94d2-618e3bdff03f\") " Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.330579 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0abbb64c-21db-46dc-94d2-618e3bdff03f-utilities\") pod \"0abbb64c-21db-46dc-94d2-618e3bdff03f\" (UID: \"0abbb64c-21db-46dc-94d2-618e3bdff03f\") " Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.331426 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0abbb64c-21db-46dc-94d2-618e3bdff03f-utilities" (OuterVolumeSpecName: "utilities") pod "0abbb64c-21db-46dc-94d2-618e3bdff03f" (UID: "0abbb64c-21db-46dc-94d2-618e3bdff03f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.331930 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0abbb64c-21db-46dc-94d2-618e3bdff03f-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.333523 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0abbb64c-21db-46dc-94d2-618e3bdff03f-kube-api-access-mngz6" (OuterVolumeSpecName: "kube-api-access-mngz6") pod "0abbb64c-21db-46dc-94d2-618e3bdff03f" (UID: "0abbb64c-21db-46dc-94d2-618e3bdff03f"). InnerVolumeSpecName "kube-api-access-mngz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.397532 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0abbb64c-21db-46dc-94d2-618e3bdff03f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0abbb64c-21db-46dc-94d2-618e3bdff03f" (UID: "0abbb64c-21db-46dc-94d2-618e3bdff03f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.433641 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mngz6\" (UniqueName: \"kubernetes.io/projected/0abbb64c-21db-46dc-94d2-618e3bdff03f-kube-api-access-mngz6\") on node \"crc\" DevicePath \"\"" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.433672 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0abbb64c-21db-46dc-94d2-618e3bdff03f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.691561 4740 generic.go:334] "Generic (PLEG): container finished" podID="0abbb64c-21db-46dc-94d2-618e3bdff03f" containerID="c6b2b9a30794f8dfda4c6f5524ed0188b0059d3158698825561514f90282ed3d" exitCode=0 Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.691641 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-249zz" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.691658 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-249zz" event={"ID":"0abbb64c-21db-46dc-94d2-618e3bdff03f","Type":"ContainerDied","Data":"c6b2b9a30794f8dfda4c6f5524ed0188b0059d3158698825561514f90282ed3d"} Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.691966 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-249zz" event={"ID":"0abbb64c-21db-46dc-94d2-618e3bdff03f","Type":"ContainerDied","Data":"6d1a98bfdecf88192848917e91cc0132cecccb6f53009bd26109ad81e3f84b0a"} Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.692033 4740 scope.go:117] "RemoveContainer" containerID="c6b2b9a30794f8dfda4c6f5524ed0188b0059d3158698825561514f90282ed3d" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.724261 4740 scope.go:117] "RemoveContainer" containerID="062c1bf2cae8cf3efa05b0cbd50fcdebb65d9d47262ec984ed68a856daf80be0" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.743267 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-249zz"] Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.762105 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-249zz"] Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.763626 4740 scope.go:117] "RemoveContainer" containerID="c498bba0faa3114f6802d2ae9286098e9127ddb4d5d1e4828bbfe70e4bdefcb9" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.828403 4740 scope.go:117] "RemoveContainer" containerID="c6b2b9a30794f8dfda4c6f5524ed0188b0059d3158698825561514f90282ed3d" Jan 05 14:35:42 crc kubenswrapper[4740]: E0105 14:35:42.828921 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b2b9a30794f8dfda4c6f5524ed0188b0059d3158698825561514f90282ed3d\": container with ID starting with c6b2b9a30794f8dfda4c6f5524ed0188b0059d3158698825561514f90282ed3d not found: ID does not exist" containerID="c6b2b9a30794f8dfda4c6f5524ed0188b0059d3158698825561514f90282ed3d" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.828977 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b2b9a30794f8dfda4c6f5524ed0188b0059d3158698825561514f90282ed3d"} err="failed to get container status 
\"c6b2b9a30794f8dfda4c6f5524ed0188b0059d3158698825561514f90282ed3d\": rpc error: code = NotFound desc = could not find container \"c6b2b9a30794f8dfda4c6f5524ed0188b0059d3158698825561514f90282ed3d\": container with ID starting with c6b2b9a30794f8dfda4c6f5524ed0188b0059d3158698825561514f90282ed3d not found: ID does not exist" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.829006 4740 scope.go:117] "RemoveContainer" containerID="062c1bf2cae8cf3efa05b0cbd50fcdebb65d9d47262ec984ed68a856daf80be0" Jan 05 14:35:42 crc kubenswrapper[4740]: E0105 14:35:42.829407 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"062c1bf2cae8cf3efa05b0cbd50fcdebb65d9d47262ec984ed68a856daf80be0\": container with ID starting with 062c1bf2cae8cf3efa05b0cbd50fcdebb65d9d47262ec984ed68a856daf80be0 not found: ID does not exist" containerID="062c1bf2cae8cf3efa05b0cbd50fcdebb65d9d47262ec984ed68a856daf80be0" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.829452 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"062c1bf2cae8cf3efa05b0cbd50fcdebb65d9d47262ec984ed68a856daf80be0"} err="failed to get container status \"062c1bf2cae8cf3efa05b0cbd50fcdebb65d9d47262ec984ed68a856daf80be0\": rpc error: code = NotFound desc = could not find container \"062c1bf2cae8cf3efa05b0cbd50fcdebb65d9d47262ec984ed68a856daf80be0\": container with ID starting with 062c1bf2cae8cf3efa05b0cbd50fcdebb65d9d47262ec984ed68a856daf80be0 not found: ID does not exist" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.829481 4740 scope.go:117] "RemoveContainer" containerID="c498bba0faa3114f6802d2ae9286098e9127ddb4d5d1e4828bbfe70e4bdefcb9" Jan 05 14:35:42 crc kubenswrapper[4740]: E0105 14:35:42.829962 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c498bba0faa3114f6802d2ae9286098e9127ddb4d5d1e4828bbfe70e4bdefcb9\": container with ID starting with c498bba0faa3114f6802d2ae9286098e9127ddb4d5d1e4828bbfe70e4bdefcb9 not found: ID does not exist" containerID="c498bba0faa3114f6802d2ae9286098e9127ddb4d5d1e4828bbfe70e4bdefcb9" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.829986 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c498bba0faa3114f6802d2ae9286098e9127ddb4d5d1e4828bbfe70e4bdefcb9"} err="failed to get container status \"c498bba0faa3114f6802d2ae9286098e9127ddb4d5d1e4828bbfe70e4bdefcb9\": rpc error: code = NotFound desc = could not find container \"c498bba0faa3114f6802d2ae9286098e9127ddb4d5d1e4828bbfe70e4bdefcb9\": container with ID starting with c498bba0faa3114f6802d2ae9286098e9127ddb4d5d1e4828bbfe70e4bdefcb9 not found: ID does not exist" Jan 05 14:35:42 crc kubenswrapper[4740]: I0105 14:35:42.989013 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0abbb64c-21db-46dc-94d2-618e3bdff03f" path="/var/lib/kubelet/pods/0abbb64c-21db-46dc-94d2-618e3bdff03f/volumes" Jan 05 14:37:01 crc kubenswrapper[4740]: I0105 14:37:01.915933 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:37:01 crc kubenswrapper[4740]: I0105 14:37:01.916744 4740 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:37:31 crc kubenswrapper[4740]: I0105 14:37:31.916509 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:37:31 crc kubenswrapper[4740]: I0105 14:37:31.917261 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:38:01 crc kubenswrapper[4740]: I0105 14:38:01.916172 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:38:01 crc kubenswrapper[4740]: I0105 14:38:01.916725 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:38:01 crc kubenswrapper[4740]: I0105 14:38:01.916781 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 14:38:01 crc kubenswrapper[4740]: I0105 14:38:01.917769 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"97adcb1f23098a3ed5375417ea1f2f00fd3cab92f0c37ade5cce5e918ccb2aef"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 14:38:01 crc kubenswrapper[4740]: I0105 14:38:01.917826 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://97adcb1f23098a3ed5375417ea1f2f00fd3cab92f0c37ade5cce5e918ccb2aef" gracePeriod=600 Jan 05 14:38:02 crc kubenswrapper[4740]: I0105 14:38:02.867880 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="97adcb1f23098a3ed5375417ea1f2f00fd3cab92f0c37ade5cce5e918ccb2aef" exitCode=0 Jan 05 14:38:02 crc kubenswrapper[4740]: I0105 14:38:02.867925 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"97adcb1f23098a3ed5375417ea1f2f00fd3cab92f0c37ade5cce5e918ccb2aef"} Jan 05 14:38:02 crc kubenswrapper[4740]: I0105 14:38:02.868428 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246"} Jan 05 14:38:02 crc kubenswrapper[4740]: I0105 14:38:02.868453 4740 scope.go:117] "RemoveContainer" containerID="64f16806a82217d217a68578e3b85911968dab2cba14c25d4fafc6685dab265b" Jan 05 14:38:37 crc kubenswrapper[4740]: I0105 14:38:37.325330 4740 generic.go:334] "Generic (PLEG): container finished" podID="f2e8ef4b-ba8e-46af-b20d-f19af317419c" containerID="babff738f21c71f6744b95a10ffe4a4d486cc799e4bfb9feb43a88251a09a36f" exitCode=0 Jan 05 14:38:37 crc kubenswrapper[4740]: I0105 14:38:37.325457 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" event={"ID":"f2e8ef4b-ba8e-46af-b20d-f19af317419c","Type":"ContainerDied","Data":"babff738f21c71f6744b95a10ffe4a4d486cc799e4bfb9feb43a88251a09a36f"} Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.048205 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.067542 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-0\") pod \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.067643 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-2\") pod \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.067741 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-1\") pod \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.067763 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-telemetry-combined-ca-bundle\") pod \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.067796 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vxqz\" (UniqueName: \"kubernetes.io/projected/f2e8ef4b-ba8e-46af-b20d-f19af317419c-kube-api-access-9vxqz\") pod \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.067880 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ssh-key\") pod \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.067941 4740 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-inventory\") pod \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\" (UID: \"f2e8ef4b-ba8e-46af-b20d-f19af317419c\") " Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.095541 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2e8ef4b-ba8e-46af-b20d-f19af317419c-kube-api-access-9vxqz" (OuterVolumeSpecName: "kube-api-access-9vxqz") pod "f2e8ef4b-ba8e-46af-b20d-f19af317419c" (UID: "f2e8ef4b-ba8e-46af-b20d-f19af317419c"). InnerVolumeSpecName "kube-api-access-9vxqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.110289 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "f2e8ef4b-ba8e-46af-b20d-f19af317419c" (UID: "f2e8ef4b-ba8e-46af-b20d-f19af317419c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.118290 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "f2e8ef4b-ba8e-46af-b20d-f19af317419c" (UID: "f2e8ef4b-ba8e-46af-b20d-f19af317419c"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.128490 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f2e8ef4b-ba8e-46af-b20d-f19af317419c" (UID: "f2e8ef4b-ba8e-46af-b20d-f19af317419c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.143599 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "f2e8ef4b-ba8e-46af-b20d-f19af317419c" (UID: "f2e8ef4b-ba8e-46af-b20d-f19af317419c"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.149140 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-inventory" (OuterVolumeSpecName: "inventory") pod "f2e8ef4b-ba8e-46af-b20d-f19af317419c" (UID: "f2e8ef4b-ba8e-46af-b20d-f19af317419c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.160464 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "f2e8ef4b-ba8e-46af-b20d-f19af317419c" (UID: "f2e8ef4b-ba8e-46af-b20d-f19af317419c"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.170727 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.170761 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.170774 4740 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.170784 4740 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.170793 4740 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.170802 4740 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2e8ef4b-ba8e-46af-b20d-f19af317419c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.170812 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vxqz\" (UniqueName: \"kubernetes.io/projected/f2e8ef4b-ba8e-46af-b20d-f19af317419c-kube-api-access-9vxqz\") on node \"crc\" DevicePath \"\"" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.354632 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" event={"ID":"f2e8ef4b-ba8e-46af-b20d-f19af317419c","Type":"ContainerDied","Data":"463f72bd20d3c9886f59ee38adedabb2d7a8e13d4bbc6ec7a61da72ab2fbd3ee"} Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.354944 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="463f72bd20d3c9886f59ee38adedabb2d7a8e13d4bbc6ec7a61da72ab2fbd3ee" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.354712 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.482468 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4"] Jan 05 14:38:39 crc kubenswrapper[4740]: E0105 14:38:39.482945 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3" containerName="extract-utilities" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.482966 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3" containerName="extract-utilities" Jan 05 14:38:39 crc kubenswrapper[4740]: E0105 14:38:39.482982 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3" containerName="extract-content" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.482990 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3" containerName="extract-content" Jan 05 14:38:39 crc kubenswrapper[4740]: E0105 14:38:39.483008 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0abbb64c-21db-46dc-94d2-618e3bdff03f" containerName="extract-content" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.483016 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0abbb64c-21db-46dc-94d2-618e3bdff03f" containerName="extract-content" Jan 05 14:38:39 crc kubenswrapper[4740]: E0105 14:38:39.483034 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0abbb64c-21db-46dc-94d2-618e3bdff03f" containerName="extract-utilities" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.483041 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0abbb64c-21db-46dc-94d2-618e3bdff03f" containerName="extract-utilities" Jan 05 14:38:39 crc kubenswrapper[4740]: E0105 14:38:39.483056 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3" containerName="registry-server" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.483081 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3" containerName="registry-server" Jan 05 14:38:39 crc kubenswrapper[4740]: E0105 14:38:39.483091 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2e8ef4b-ba8e-46af-b20d-f19af317419c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.483098 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2e8ef4b-ba8e-46af-b20d-f19af317419c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 05 14:38:39 crc kubenswrapper[4740]: E0105 14:38:39.483124 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0abbb64c-21db-46dc-94d2-618e3bdff03f" containerName="registry-server" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.483129 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0abbb64c-21db-46dc-94d2-618e3bdff03f" containerName="registry-server" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.483361 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2e8ef4b-ba8e-46af-b20d-f19af317419c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.483384 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0698bb2-ffbb-4f7c-a98d-1d11735ee4a3" 
containerName="registry-server" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.483403 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0abbb64c-21db-46dc-94d2-618e3bdff03f" containerName="registry-server" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.484162 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.487046 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.488210 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.489043 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.489158 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.489159 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.495044 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4"] Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.577483 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcjpj\" (UniqueName: \"kubernetes.io/projected/6f1d62a2-396b-4081-a429-95ccbc8c8764-kube-api-access-wcjpj\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.577558 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.577641 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.577694 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 
crc kubenswrapper[4740]: I0105 14:38:39.577939 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.578019 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.578090 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.680869 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcjpj\" (UniqueName: \"kubernetes.io/projected/6f1d62a2-396b-4081-a429-95ccbc8c8764-kube-api-access-wcjpj\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.680946 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.681008 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.681063 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.681135 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.681164 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.681195 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.686211 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.686694 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.687961 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.688304 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.691162 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " 
pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.691830 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.704445 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcjpj\" (UniqueName: \"kubernetes.io/projected/6f1d62a2-396b-4081-a429-95ccbc8c8764-kube-api-access-wcjpj\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:39 crc kubenswrapper[4740]: I0105 14:38:39.803669 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:38:40 crc kubenswrapper[4740]: I0105 14:38:40.420098 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4"] Jan 05 14:38:41 crc kubenswrapper[4740]: I0105 14:38:41.377943 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" event={"ID":"6f1d62a2-396b-4081-a429-95ccbc8c8764","Type":"ContainerStarted","Data":"e1719ae168d1145e5d173436ae7cc15fef836f5a8857a4eeaf34285825e0c539"} Jan 05 14:38:41 crc kubenswrapper[4740]: I0105 14:38:41.378426 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" event={"ID":"6f1d62a2-396b-4081-a429-95ccbc8c8764","Type":"ContainerStarted","Data":"c6cb132cbb34c62170368da75c21d180b75d2d271f9d7ae424f267cae7e35a53"} Jan 05 14:38:41 crc kubenswrapper[4740]: I0105 14:38:41.417538 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" podStartSLOduration=1.887779447 podStartE2EDuration="2.417509979s" podCreationTimestamp="2026-01-05 14:38:39 +0000 UTC" firstStartedPulling="2026-01-05 14:38:40.424983964 +0000 UTC m=+2969.731892543" lastFinishedPulling="2026-01-05 14:38:40.954714486 +0000 UTC m=+2970.261623075" observedRunningTime="2026-01-05 14:38:41.406922863 +0000 UTC m=+2970.713831482" watchObservedRunningTime="2026-01-05 14:38:41.417509979 +0000 UTC m=+2970.724418598" Jan 05 14:40:31 crc kubenswrapper[4740]: I0105 14:40:31.916573 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:40:31 crc kubenswrapper[4740]: I0105 14:40:31.917527 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:40:48 crc kubenswrapper[4740]: 
I0105 14:40:48.477783 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t9mgv"] Jan 05 14:40:48 crc kubenswrapper[4740]: I0105 14:40:48.481616 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t9mgv" Jan 05 14:40:48 crc kubenswrapper[4740]: I0105 14:40:48.492560 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t9mgv"] Jan 05 14:40:48 crc kubenswrapper[4740]: I0105 14:40:48.547535 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gckjm\" (UniqueName: \"kubernetes.io/projected/2f7de250-6a7b-41d2-a90d-a34e19830990-kube-api-access-gckjm\") pod \"redhat-operators-t9mgv\" (UID: \"2f7de250-6a7b-41d2-a90d-a34e19830990\") " pod="openshift-marketplace/redhat-operators-t9mgv" Jan 05 14:40:48 crc kubenswrapper[4740]: I0105 14:40:48.547736 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f7de250-6a7b-41d2-a90d-a34e19830990-utilities\") pod \"redhat-operators-t9mgv\" (UID: \"2f7de250-6a7b-41d2-a90d-a34e19830990\") " pod="openshift-marketplace/redhat-operators-t9mgv" Jan 05 14:40:48 crc kubenswrapper[4740]: I0105 14:40:48.548018 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f7de250-6a7b-41d2-a90d-a34e19830990-catalog-content\") pod \"redhat-operators-t9mgv\" (UID: \"2f7de250-6a7b-41d2-a90d-a34e19830990\") " pod="openshift-marketplace/redhat-operators-t9mgv" Jan 05 14:40:48 crc kubenswrapper[4740]: I0105 14:40:48.650820 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gckjm\" (UniqueName: \"kubernetes.io/projected/2f7de250-6a7b-41d2-a90d-a34e19830990-kube-api-access-gckjm\") pod \"redhat-operators-t9mgv\" (UID: \"2f7de250-6a7b-41d2-a90d-a34e19830990\") " pod="openshift-marketplace/redhat-operators-t9mgv" Jan 05 14:40:48 crc kubenswrapper[4740]: I0105 14:40:48.650886 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f7de250-6a7b-41d2-a90d-a34e19830990-utilities\") pod \"redhat-operators-t9mgv\" (UID: \"2f7de250-6a7b-41d2-a90d-a34e19830990\") " pod="openshift-marketplace/redhat-operators-t9mgv" Jan 05 14:40:48 crc kubenswrapper[4740]: I0105 14:40:48.650974 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f7de250-6a7b-41d2-a90d-a34e19830990-catalog-content\") pod \"redhat-operators-t9mgv\" (UID: \"2f7de250-6a7b-41d2-a90d-a34e19830990\") " pod="openshift-marketplace/redhat-operators-t9mgv" Jan 05 14:40:48 crc kubenswrapper[4740]: I0105 14:40:48.651388 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f7de250-6a7b-41d2-a90d-a34e19830990-utilities\") pod \"redhat-operators-t9mgv\" (UID: \"2f7de250-6a7b-41d2-a90d-a34e19830990\") " pod="openshift-marketplace/redhat-operators-t9mgv" Jan 05 14:40:48 crc kubenswrapper[4740]: I0105 14:40:48.651431 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f7de250-6a7b-41d2-a90d-a34e19830990-catalog-content\") pod \"redhat-operators-t9mgv\" (UID: 
\"2f7de250-6a7b-41d2-a90d-a34e19830990\") " pod="openshift-marketplace/redhat-operators-t9mgv" Jan 05 14:40:48 crc kubenswrapper[4740]: I0105 14:40:48.672063 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gckjm\" (UniqueName: \"kubernetes.io/projected/2f7de250-6a7b-41d2-a90d-a34e19830990-kube-api-access-gckjm\") pod \"redhat-operators-t9mgv\" (UID: \"2f7de250-6a7b-41d2-a90d-a34e19830990\") " pod="openshift-marketplace/redhat-operators-t9mgv" Jan 05 14:40:48 crc kubenswrapper[4740]: I0105 14:40:48.807961 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t9mgv" Jan 05 14:40:49 crc kubenswrapper[4740]: I0105 14:40:49.312508 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t9mgv"] Jan 05 14:40:50 crc kubenswrapper[4740]: I0105 14:40:50.029498 4740 generic.go:334] "Generic (PLEG): container finished" podID="2f7de250-6a7b-41d2-a90d-a34e19830990" containerID="96ae9cd31ad7ebc514dac7d35e7a77367651790df740860f1aa12eeaf0f31592" exitCode=0 Jan 05 14:40:50 crc kubenswrapper[4740]: I0105 14:40:50.029558 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9mgv" event={"ID":"2f7de250-6a7b-41d2-a90d-a34e19830990","Type":"ContainerDied","Data":"96ae9cd31ad7ebc514dac7d35e7a77367651790df740860f1aa12eeaf0f31592"} Jan 05 14:40:50 crc kubenswrapper[4740]: I0105 14:40:50.029758 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9mgv" event={"ID":"2f7de250-6a7b-41d2-a90d-a34e19830990","Type":"ContainerStarted","Data":"0acbb880bf4e90e8253e61b74182ad1a2b5b2d462e3cff67506add1eeb04c38e"} Jan 05 14:40:50 crc kubenswrapper[4740]: I0105 14:40:50.032021 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 05 14:40:51 crc kubenswrapper[4740]: I0105 14:40:51.049402 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9mgv" event={"ID":"2f7de250-6a7b-41d2-a90d-a34e19830990","Type":"ContainerStarted","Data":"d6e74b0b48445f44127dd4e2f91a1904bb3fa0f5fe22a71c968ae8f3ffb579b0"} Jan 05 14:40:55 crc kubenswrapper[4740]: I0105 14:40:55.108546 4740 generic.go:334] "Generic (PLEG): container finished" podID="2f7de250-6a7b-41d2-a90d-a34e19830990" containerID="d6e74b0b48445f44127dd4e2f91a1904bb3fa0f5fe22a71c968ae8f3ffb579b0" exitCode=0 Jan 05 14:40:55 crc kubenswrapper[4740]: I0105 14:40:55.108670 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9mgv" event={"ID":"2f7de250-6a7b-41d2-a90d-a34e19830990","Type":"ContainerDied","Data":"d6e74b0b48445f44127dd4e2f91a1904bb3fa0f5fe22a71c968ae8f3ffb579b0"} Jan 05 14:40:56 crc kubenswrapper[4740]: I0105 14:40:56.127135 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9mgv" event={"ID":"2f7de250-6a7b-41d2-a90d-a34e19830990","Type":"ContainerStarted","Data":"d072ce79e5d02a53552db86a233e9c92d5b1b010cc35f2f79ebe98d6a92c6761"} Jan 05 14:40:56 crc kubenswrapper[4740]: I0105 14:40:56.162035 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t9mgv" podStartSLOduration=2.640957682 podStartE2EDuration="8.162013078s" podCreationTimestamp="2026-01-05 14:40:48 +0000 UTC" firstStartedPulling="2026-01-05 14:40:50.031731196 +0000 UTC m=+3099.338639785" lastFinishedPulling="2026-01-05 
14:40:55.552786602 +0000 UTC m=+3104.859695181" observedRunningTime="2026-01-05 14:40:56.15243746 +0000 UTC m=+3105.459346049" watchObservedRunningTime="2026-01-05 14:40:56.162013078 +0000 UTC m=+3105.468921667" Jan 05 14:40:58 crc kubenswrapper[4740]: I0105 14:40:58.808644 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t9mgv" Jan 05 14:40:58 crc kubenswrapper[4740]: I0105 14:40:58.808960 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t9mgv" Jan 05 14:40:59 crc kubenswrapper[4740]: I0105 14:40:59.888724 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t9mgv" podUID="2f7de250-6a7b-41d2-a90d-a34e19830990" containerName="registry-server" probeResult="failure" output=< Jan 05 14:40:59 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 14:40:59 crc kubenswrapper[4740]: > Jan 05 14:41:01 crc kubenswrapper[4740]: I0105 14:41:01.916471 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:41:01 crc kubenswrapper[4740]: I0105 14:41:01.916549 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:41:08 crc kubenswrapper[4740]: I0105 14:41:08.879099 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t9mgv" Jan 05 14:41:08 crc kubenswrapper[4740]: I0105 14:41:08.953328 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t9mgv" Jan 05 14:41:09 crc kubenswrapper[4740]: I0105 14:41:09.132892 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t9mgv"] Jan 05 14:41:10 crc kubenswrapper[4740]: I0105 14:41:10.329420 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t9mgv" podUID="2f7de250-6a7b-41d2-a90d-a34e19830990" containerName="registry-server" containerID="cri-o://d072ce79e5d02a53552db86a233e9c92d5b1b010cc35f2f79ebe98d6a92c6761" gracePeriod=2 Jan 05 14:41:10 crc kubenswrapper[4740]: I0105 14:41:10.965524 4740 util.go:48] "No ready sandbox for pod can be found. 
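The registry-server startup probe above first fails with "timeout: failed to connect service \":50051\" within 1s" and only reports started/ready once the catalog is serving. A minimal client-side stand-in for that check, a plain TCP dial with the same 1s budget (the 127.0.0.1 address is an assumption for illustration; the real probe runs inside the pod against its own gRPC port):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Hypothetical target: the gRPC port quoted in the probe output above.
	addr := "127.0.0.1:50051"

	conn, err := net.DialTimeout("tcp", addr, 1*time.Second)
	if err != nil {
		// While the catalog is still extracting content, this is the failure
		// path the Startup probe keeps reporting.
		fmt.Printf("probe failure: %v\n", err)
		return
	}
	defer conn.Close()
	fmt.Println("port 50051 is accepting connections; probe would pass")
}
```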
Need to start a new one" pod="openshift-marketplace/redhat-operators-t9mgv" Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.088592 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f7de250-6a7b-41d2-a90d-a34e19830990-utilities\") pod \"2f7de250-6a7b-41d2-a90d-a34e19830990\" (UID: \"2f7de250-6a7b-41d2-a90d-a34e19830990\") " Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.088931 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gckjm\" (UniqueName: \"kubernetes.io/projected/2f7de250-6a7b-41d2-a90d-a34e19830990-kube-api-access-gckjm\") pod \"2f7de250-6a7b-41d2-a90d-a34e19830990\" (UID: \"2f7de250-6a7b-41d2-a90d-a34e19830990\") " Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.089349 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f7de250-6a7b-41d2-a90d-a34e19830990-catalog-content\") pod \"2f7de250-6a7b-41d2-a90d-a34e19830990\" (UID: \"2f7de250-6a7b-41d2-a90d-a34e19830990\") " Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.089667 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f7de250-6a7b-41d2-a90d-a34e19830990-utilities" (OuterVolumeSpecName: "utilities") pod "2f7de250-6a7b-41d2-a90d-a34e19830990" (UID: "2f7de250-6a7b-41d2-a90d-a34e19830990"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.090973 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f7de250-6a7b-41d2-a90d-a34e19830990-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.094526 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f7de250-6a7b-41d2-a90d-a34e19830990-kube-api-access-gckjm" (OuterVolumeSpecName: "kube-api-access-gckjm") pod "2f7de250-6a7b-41d2-a90d-a34e19830990" (UID: "2f7de250-6a7b-41d2-a90d-a34e19830990"). InnerVolumeSpecName "kube-api-access-gckjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.193516 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gckjm\" (UniqueName: \"kubernetes.io/projected/2f7de250-6a7b-41d2-a90d-a34e19830990-kube-api-access-gckjm\") on node \"crc\" DevicePath \"\"" Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.212821 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f7de250-6a7b-41d2-a90d-a34e19830990-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f7de250-6a7b-41d2-a90d-a34e19830990" (UID: "2f7de250-6a7b-41d2-a90d-a34e19830990"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.295773 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f7de250-6a7b-41d2-a90d-a34e19830990-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.340644 4740 generic.go:334] "Generic (PLEG): container finished" podID="2f7de250-6a7b-41d2-a90d-a34e19830990" containerID="d072ce79e5d02a53552db86a233e9c92d5b1b010cc35f2f79ebe98d6a92c6761" exitCode=0 Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.340685 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9mgv" event={"ID":"2f7de250-6a7b-41d2-a90d-a34e19830990","Type":"ContainerDied","Data":"d072ce79e5d02a53552db86a233e9c92d5b1b010cc35f2f79ebe98d6a92c6761"} Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.340714 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9mgv" event={"ID":"2f7de250-6a7b-41d2-a90d-a34e19830990","Type":"ContainerDied","Data":"0acbb880bf4e90e8253e61b74182ad1a2b5b2d462e3cff67506add1eeb04c38e"} Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.340729 4740 scope.go:117] "RemoveContainer" containerID="d072ce79e5d02a53552db86a233e9c92d5b1b010cc35f2f79ebe98d6a92c6761" Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.340759 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t9mgv" Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.384458 4740 scope.go:117] "RemoveContainer" containerID="d6e74b0b48445f44127dd4e2f91a1904bb3fa0f5fe22a71c968ae8f3ffb579b0" Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.385921 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t9mgv"] Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.396619 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t9mgv"] Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.410019 4740 scope.go:117] "RemoveContainer" containerID="96ae9cd31ad7ebc514dac7d35e7a77367651790df740860f1aa12eeaf0f31592" Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.456208 4740 scope.go:117] "RemoveContainer" containerID="d072ce79e5d02a53552db86a233e9c92d5b1b010cc35f2f79ebe98d6a92c6761" Jan 05 14:41:11 crc kubenswrapper[4740]: E0105 14:41:11.456626 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d072ce79e5d02a53552db86a233e9c92d5b1b010cc35f2f79ebe98d6a92c6761\": container with ID starting with d072ce79e5d02a53552db86a233e9c92d5b1b010cc35f2f79ebe98d6a92c6761 not found: ID does not exist" containerID="d072ce79e5d02a53552db86a233e9c92d5b1b010cc35f2f79ebe98d6a92c6761" Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.456663 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d072ce79e5d02a53552db86a233e9c92d5b1b010cc35f2f79ebe98d6a92c6761"} err="failed to get container status \"d072ce79e5d02a53552db86a233e9c92d5b1b010cc35f2f79ebe98d6a92c6761\": rpc error: code = NotFound desc = could not find container \"d072ce79e5d02a53552db86a233e9c92d5b1b010cc35f2f79ebe98d6a92c6761\": container with ID starting with d072ce79e5d02a53552db86a233e9c92d5b1b010cc35f2f79ebe98d6a92c6761 not found: ID does not exist" Jan 05 14:41:11 crc 
kubenswrapper[4740]: I0105 14:41:11.456688 4740 scope.go:117] "RemoveContainer" containerID="d6e74b0b48445f44127dd4e2f91a1904bb3fa0f5fe22a71c968ae8f3ffb579b0" Jan 05 14:41:11 crc kubenswrapper[4740]: E0105 14:41:11.457191 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6e74b0b48445f44127dd4e2f91a1904bb3fa0f5fe22a71c968ae8f3ffb579b0\": container with ID starting with d6e74b0b48445f44127dd4e2f91a1904bb3fa0f5fe22a71c968ae8f3ffb579b0 not found: ID does not exist" containerID="d6e74b0b48445f44127dd4e2f91a1904bb3fa0f5fe22a71c968ae8f3ffb579b0" Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.457249 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e74b0b48445f44127dd4e2f91a1904bb3fa0f5fe22a71c968ae8f3ffb579b0"} err="failed to get container status \"d6e74b0b48445f44127dd4e2f91a1904bb3fa0f5fe22a71c968ae8f3ffb579b0\": rpc error: code = NotFound desc = could not find container \"d6e74b0b48445f44127dd4e2f91a1904bb3fa0f5fe22a71c968ae8f3ffb579b0\": container with ID starting with d6e74b0b48445f44127dd4e2f91a1904bb3fa0f5fe22a71c968ae8f3ffb579b0 not found: ID does not exist" Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.457295 4740 scope.go:117] "RemoveContainer" containerID="96ae9cd31ad7ebc514dac7d35e7a77367651790df740860f1aa12eeaf0f31592" Jan 05 14:41:11 crc kubenswrapper[4740]: E0105 14:41:11.457923 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96ae9cd31ad7ebc514dac7d35e7a77367651790df740860f1aa12eeaf0f31592\": container with ID starting with 96ae9cd31ad7ebc514dac7d35e7a77367651790df740860f1aa12eeaf0f31592 not found: ID does not exist" containerID="96ae9cd31ad7ebc514dac7d35e7a77367651790df740860f1aa12eeaf0f31592" Jan 05 14:41:11 crc kubenswrapper[4740]: I0105 14:41:11.457984 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ae9cd31ad7ebc514dac7d35e7a77367651790df740860f1aa12eeaf0f31592"} err="failed to get container status \"96ae9cd31ad7ebc514dac7d35e7a77367651790df740860f1aa12eeaf0f31592\": rpc error: code = NotFound desc = could not find container \"96ae9cd31ad7ebc514dac7d35e7a77367651790df740860f1aa12eeaf0f31592\": container with ID starting with 96ae9cd31ad7ebc514dac7d35e7a77367651790df740860f1aa12eeaf0f31592 not found: ID does not exist" Jan 05 14:41:12 crc kubenswrapper[4740]: I0105 14:41:12.997026 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f7de250-6a7b-41d2-a90d-a34e19830990" path="/var/lib/kubelet/pods/2f7de250-6a7b-41d2-a90d-a34e19830990/volumes" Jan 05 14:41:15 crc kubenswrapper[4740]: I0105 14:41:15.399504 4740 generic.go:334] "Generic (PLEG): container finished" podID="6f1d62a2-396b-4081-a429-95ccbc8c8764" containerID="e1719ae168d1145e5d173436ae7cc15fef836f5a8857a4eeaf34285825e0c539" exitCode=0 Jan 05 14:41:15 crc kubenswrapper[4740]: I0105 14:41:15.399641 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" event={"ID":"6f1d62a2-396b-4081-a429-95ccbc8c8764","Type":"ContainerDied","Data":"e1719ae168d1145e5d173436ae7cc15fef836f5a8857a4eeaf34285825e0c539"} Jan 05 14:41:16 crc kubenswrapper[4740]: I0105 14:41:16.945799 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.060921 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-telemetry-power-monitoring-combined-ca-bundle\") pod \"6f1d62a2-396b-4081-a429-95ccbc8c8764\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.061033 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ssh-key\") pod \"6f1d62a2-396b-4081-a429-95ccbc8c8764\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.061113 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-inventory\") pod \"6f1d62a2-396b-4081-a429-95ccbc8c8764\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.061179 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-1\") pod \"6f1d62a2-396b-4081-a429-95ccbc8c8764\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.061256 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcjpj\" (UniqueName: \"kubernetes.io/projected/6f1d62a2-396b-4081-a429-95ccbc8c8764-kube-api-access-wcjpj\") pod \"6f1d62a2-396b-4081-a429-95ccbc8c8764\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.061345 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-0\") pod \"6f1d62a2-396b-4081-a429-95ccbc8c8764\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.061364 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-2\") pod \"6f1d62a2-396b-4081-a429-95ccbc8c8764\" (UID: \"6f1d62a2-396b-4081-a429-95ccbc8c8764\") " Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.067656 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "6f1d62a2-396b-4081-a429-95ccbc8c8764" (UID: "6f1d62a2-396b-4081-a429-95ccbc8c8764"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.069397 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f1d62a2-396b-4081-a429-95ccbc8c8764-kube-api-access-wcjpj" (OuterVolumeSpecName: "kube-api-access-wcjpj") pod "6f1d62a2-396b-4081-a429-95ccbc8c8764" (UID: "6f1d62a2-396b-4081-a429-95ccbc8c8764"). InnerVolumeSpecName "kube-api-access-wcjpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.105973 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "6f1d62a2-396b-4081-a429-95ccbc8c8764" (UID: "6f1d62a2-396b-4081-a429-95ccbc8c8764"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.106805 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "6f1d62a2-396b-4081-a429-95ccbc8c8764" (UID: "6f1d62a2-396b-4081-a429-95ccbc8c8764"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.107671 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-inventory" (OuterVolumeSpecName: "inventory") pod "6f1d62a2-396b-4081-a429-95ccbc8c8764" (UID: "6f1d62a2-396b-4081-a429-95ccbc8c8764"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.123283 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6f1d62a2-396b-4081-a429-95ccbc8c8764" (UID: "6f1d62a2-396b-4081-a429-95ccbc8c8764"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.134714 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "6f1d62a2-396b-4081-a429-95ccbc8c8764" (UID: "6f1d62a2-396b-4081-a429-95ccbc8c8764"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.165364 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.165415 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.165440 4740 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.165462 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcjpj\" (UniqueName: \"kubernetes.io/projected/6f1d62a2-396b-4081-a429-95ccbc8c8764-kube-api-access-wcjpj\") on node \"crc\" DevicePath \"\"" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.165487 4740 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.165506 4740 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.165529 4740 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f1d62a2-396b-4081-a429-95ccbc8c8764-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.430707 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" event={"ID":"6f1d62a2-396b-4081-a429-95ccbc8c8764","Type":"ContainerDied","Data":"c6cb132cbb34c62170368da75c21d180b75d2d271f9d7ae424f267cae7e35a53"} Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.431390 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6cb132cbb34c62170368da75c21d180b75d2d271f9d7ae424f267cae7e35a53" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.430799 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.569948 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg"] Jan 05 14:41:17 crc kubenswrapper[4740]: E0105 14:41:17.570572 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1d62a2-396b-4081-a429-95ccbc8c8764" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.570594 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1d62a2-396b-4081-a429-95ccbc8c8764" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Jan 05 14:41:17 crc kubenswrapper[4740]: E0105 14:41:17.570634 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f7de250-6a7b-41d2-a90d-a34e19830990" containerName="registry-server" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.570644 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f7de250-6a7b-41d2-a90d-a34e19830990" containerName="registry-server" Jan 05 14:41:17 crc kubenswrapper[4740]: E0105 14:41:17.570671 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f7de250-6a7b-41d2-a90d-a34e19830990" containerName="extract-content" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.570680 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f7de250-6a7b-41d2-a90d-a34e19830990" containerName="extract-content" Jan 05 14:41:17 crc kubenswrapper[4740]: E0105 14:41:17.570702 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f7de250-6a7b-41d2-a90d-a34e19830990" containerName="extract-utilities" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.570710 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f7de250-6a7b-41d2-a90d-a34e19830990" containerName="extract-utilities" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.571041 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1d62a2-396b-4081-a429-95ccbc8c8764" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.571114 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f7de250-6a7b-41d2-a90d-a34e19830990" containerName="registry-server" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.572184 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.575190 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.578269 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.578526 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftsr9" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.578752 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.579854 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.585010 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg"] Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.678479 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-944pj\" (UniqueName: \"kubernetes.io/projected/f6735580-cec3-4c58-82ed-37c1b38ba74c-kube-api-access-944pj\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k56qg\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.678554 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k56qg\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.678972 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k56qg\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.679024 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k56qg\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.679222 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k56qg\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.781060 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-944pj\" (UniqueName: \"kubernetes.io/projected/f6735580-cec3-4c58-82ed-37c1b38ba74c-kube-api-access-944pj\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k56qg\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.781214 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k56qg\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.781418 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k56qg\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.781453 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k56qg\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.781549 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k56qg\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.787688 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k56qg\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.787983 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k56qg\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.788113 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k56qg\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.788323 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k56qg\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.797418 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-944pj\" (UniqueName: \"kubernetes.io/projected/f6735580-cec3-4c58-82ed-37c1b38ba74c-kube-api-access-944pj\") pod \"logging-edpm-deployment-openstack-edpm-ipam-k56qg\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:17 crc kubenswrapper[4740]: I0105 14:41:17.915847 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:18 crc kubenswrapper[4740]: I0105 14:41:18.609418 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg"] Jan 05 14:41:19 crc kubenswrapper[4740]: I0105 14:41:19.477571 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" event={"ID":"f6735580-cec3-4c58-82ed-37c1b38ba74c","Type":"ContainerStarted","Data":"e2241cbcb47a35aed172c083732feb7b72dff6260a7baddcd7871178c83ccab3"} Jan 05 14:41:19 crc kubenswrapper[4740]: I0105 14:41:19.478512 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" event={"ID":"f6735580-cec3-4c58-82ed-37c1b38ba74c","Type":"ContainerStarted","Data":"9623f2f0e31572924e0d8d5e94e97472a9a7176c699ed59c6e51cb91c862031e"} Jan 05 14:41:19 crc kubenswrapper[4740]: I0105 14:41:19.502881 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" podStartSLOduration=1.9601323389999998 podStartE2EDuration="2.502861591s" podCreationTimestamp="2026-01-05 14:41:17 +0000 UTC" firstStartedPulling="2026-01-05 14:41:18.618916658 +0000 UTC m=+3127.925825277" lastFinishedPulling="2026-01-05 14:41:19.16164595 +0000 UTC m=+3128.468554529" observedRunningTime="2026-01-05 14:41:19.490772405 +0000 UTC m=+3128.797681004" watchObservedRunningTime="2026-01-05 14:41:19.502861591 +0000 UTC m=+3128.809770170" Jan 05 14:41:31 crc kubenswrapper[4740]: I0105 14:41:31.916352 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:41:31 crc kubenswrapper[4740]: I0105 14:41:31.917013 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:41:31 crc kubenswrapper[4740]: I0105 14:41:31.917113 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 14:41:31 crc kubenswrapper[4740]: I0105 14:41:31.918311 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 14:41:31 crc kubenswrapper[4740]: I0105 14:41:31.918417 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" gracePeriod=600 Jan 05 14:41:32 crc kubenswrapper[4740]: E0105 14:41:32.047473 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:41:32 crc kubenswrapper[4740]: I0105 14:41:32.673916 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" exitCode=0 Jan 05 14:41:32 crc kubenswrapper[4740]: I0105 14:41:32.674009 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246"} Jan 05 14:41:32 crc kubenswrapper[4740]: I0105 14:41:32.675437 4740 scope.go:117] "RemoveContainer" containerID="97adcb1f23098a3ed5375417ea1f2f00fd3cab92f0c37ade5cce5e918ccb2aef" Jan 05 14:41:32 crc kubenswrapper[4740]: I0105 14:41:32.676812 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:41:32 crc kubenswrapper[4740]: E0105 14:41:32.677330 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:41:37 crc kubenswrapper[4740]: I0105 14:41:37.762775 4740 generic.go:334] "Generic (PLEG): container finished" podID="f6735580-cec3-4c58-82ed-37c1b38ba74c" containerID="e2241cbcb47a35aed172c083732feb7b72dff6260a7baddcd7871178c83ccab3" exitCode=0 Jan 05 14:41:37 crc kubenswrapper[4740]: I0105 14:41:37.762810 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" event={"ID":"f6735580-cec3-4c58-82ed-37c1b38ba74c","Type":"ContainerDied","Data":"e2241cbcb47a35aed172c083732feb7b72dff6260a7baddcd7871178c83ccab3"} Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.268888 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.355977 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-logging-compute-config-data-1\") pod \"f6735580-cec3-4c58-82ed-37c1b38ba74c\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.356609 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-944pj\" (UniqueName: \"kubernetes.io/projected/f6735580-cec3-4c58-82ed-37c1b38ba74c-kube-api-access-944pj\") pod \"f6735580-cec3-4c58-82ed-37c1b38ba74c\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.357230 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-ssh-key\") pod \"f6735580-cec3-4c58-82ed-37c1b38ba74c\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.357928 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-logging-compute-config-data-0\") pod \"f6735580-cec3-4c58-82ed-37c1b38ba74c\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.358029 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-inventory\") pod \"f6735580-cec3-4c58-82ed-37c1b38ba74c\" (UID: \"f6735580-cec3-4c58-82ed-37c1b38ba74c\") " Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.387135 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6735580-cec3-4c58-82ed-37c1b38ba74c-kube-api-access-944pj" (OuterVolumeSpecName: "kube-api-access-944pj") pod "f6735580-cec3-4c58-82ed-37c1b38ba74c" (UID: "f6735580-cec3-4c58-82ed-37c1b38ba74c"). InnerVolumeSpecName "kube-api-access-944pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.399312 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "f6735580-cec3-4c58-82ed-37c1b38ba74c" (UID: "f6735580-cec3-4c58-82ed-37c1b38ba74c"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.405182 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "f6735580-cec3-4c58-82ed-37c1b38ba74c" (UID: "f6735580-cec3-4c58-82ed-37c1b38ba74c"). InnerVolumeSpecName "logging-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.411362 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-inventory" (OuterVolumeSpecName: "inventory") pod "f6735580-cec3-4c58-82ed-37c1b38ba74c" (UID: "f6735580-cec3-4c58-82ed-37c1b38ba74c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.417574 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f6735580-cec3-4c58-82ed-37c1b38ba74c" (UID: "f6735580-cec3-4c58-82ed-37c1b38ba74c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.464682 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.464913 4740 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.465037 4740 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-inventory\") on node \"crc\" DevicePath \"\"" Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.465139 4740 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f6735580-cec3-4c58-82ed-37c1b38ba74c-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.465221 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-944pj\" (UniqueName: \"kubernetes.io/projected/f6735580-cec3-4c58-82ed-37c1b38ba74c-kube-api-access-944pj\") on node \"crc\" DevicePath \"\"" Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.795271 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" event={"ID":"f6735580-cec3-4c58-82ed-37c1b38ba74c","Type":"ContainerDied","Data":"9623f2f0e31572924e0d8d5e94e97472a9a7176c699ed59c6e51cb91c862031e"} Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.795684 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9623f2f0e31572924e0d8d5e94e97472a9a7176c699ed59c6e51cb91c862031e" Jan 05 14:41:39 crc kubenswrapper[4740]: I0105 14:41:39.795340 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-k56qg" Jan 05 14:41:46 crc kubenswrapper[4740]: I0105 14:41:46.969421 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:41:46 crc kubenswrapper[4740]: E0105 14:41:46.970680 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:41:57 crc kubenswrapper[4740]: I0105 14:41:57.968654 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:41:57 crc kubenswrapper[4740]: E0105 14:41:57.969474 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:42:09 crc kubenswrapper[4740]: I0105 14:42:09.968494 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:42:09 crc kubenswrapper[4740]: E0105 14:42:09.970670 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:42:21 crc kubenswrapper[4740]: I0105 14:42:21.969562 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:42:21 crc kubenswrapper[4740]: E0105 14:42:21.971148 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:42:33 crc kubenswrapper[4740]: E0105 14:42:33.551537 4740 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.97:59192->38.102.83.97:45101: write tcp 38.102.83.97:59192->38.102.83.97:45101: write: broken pipe Jan 05 14:42:36 crc kubenswrapper[4740]: I0105 14:42:36.969059 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:42:36 crc kubenswrapper[4740]: E0105 14:42:36.970375 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:42:50 crc kubenswrapper[4740]: I0105 14:42:50.968049 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:42:50 crc kubenswrapper[4740]: E0105 14:42:50.968752 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:43:02 crc kubenswrapper[4740]: I0105 14:43:02.967958 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:43:02 crc kubenswrapper[4740]: E0105 14:43:02.968693 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:43:04 crc kubenswrapper[4740]: I0105 14:43:04.318616 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vmlxv"] Jan 05 14:43:04 crc kubenswrapper[4740]: E0105 14:43:04.319901 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6735580-cec3-4c58-82ed-37c1b38ba74c" containerName="logging-edpm-deployment-openstack-edpm-ipam" Jan 05 14:43:04 crc kubenswrapper[4740]: I0105 14:43:04.319933 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6735580-cec3-4c58-82ed-37c1b38ba74c" containerName="logging-edpm-deployment-openstack-edpm-ipam" Jan 05 14:43:04 crc kubenswrapper[4740]: I0105 14:43:04.320433 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6735580-cec3-4c58-82ed-37c1b38ba74c" containerName="logging-edpm-deployment-openstack-edpm-ipam" Jan 05 14:43:04 crc kubenswrapper[4740]: I0105 14:43:04.323824 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vmlxv" Jan 05 14:43:04 crc kubenswrapper[4740]: I0105 14:43:04.342998 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vmlxv"] Jan 05 14:43:04 crc kubenswrapper[4740]: I0105 14:43:04.507907 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd776c90-77d9-43f1-ac55-0125ff70cf9c-utilities\") pod \"certified-operators-vmlxv\" (UID: \"cd776c90-77d9-43f1-ac55-0125ff70cf9c\") " pod="openshift-marketplace/certified-operators-vmlxv" Jan 05 14:43:04 crc kubenswrapper[4740]: I0105 14:43:04.508521 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbwdw\" (UniqueName: \"kubernetes.io/projected/cd776c90-77d9-43f1-ac55-0125ff70cf9c-kube-api-access-dbwdw\") pod \"certified-operators-vmlxv\" (UID: \"cd776c90-77d9-43f1-ac55-0125ff70cf9c\") " pod="openshift-marketplace/certified-operators-vmlxv" Jan 05 14:43:04 crc kubenswrapper[4740]: I0105 14:43:04.508595 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd776c90-77d9-43f1-ac55-0125ff70cf9c-catalog-content\") pod \"certified-operators-vmlxv\" (UID: \"cd776c90-77d9-43f1-ac55-0125ff70cf9c\") " pod="openshift-marketplace/certified-operators-vmlxv" Jan 05 14:43:04 crc kubenswrapper[4740]: I0105 14:43:04.611108 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbwdw\" (UniqueName: \"kubernetes.io/projected/cd776c90-77d9-43f1-ac55-0125ff70cf9c-kube-api-access-dbwdw\") pod \"certified-operators-vmlxv\" (UID: \"cd776c90-77d9-43f1-ac55-0125ff70cf9c\") " pod="openshift-marketplace/certified-operators-vmlxv" Jan 05 14:43:04 crc kubenswrapper[4740]: I0105 14:43:04.611157 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd776c90-77d9-43f1-ac55-0125ff70cf9c-catalog-content\") pod \"certified-operators-vmlxv\" (UID: \"cd776c90-77d9-43f1-ac55-0125ff70cf9c\") " pod="openshift-marketplace/certified-operators-vmlxv" Jan 05 14:43:04 crc kubenswrapper[4740]: I0105 14:43:04.611259 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd776c90-77d9-43f1-ac55-0125ff70cf9c-utilities\") pod \"certified-operators-vmlxv\" (UID: \"cd776c90-77d9-43f1-ac55-0125ff70cf9c\") " pod="openshift-marketplace/certified-operators-vmlxv" Jan 05 14:43:04 crc kubenswrapper[4740]: I0105 14:43:04.612191 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd776c90-77d9-43f1-ac55-0125ff70cf9c-catalog-content\") pod \"certified-operators-vmlxv\" (UID: \"cd776c90-77d9-43f1-ac55-0125ff70cf9c\") " pod="openshift-marketplace/certified-operators-vmlxv" Jan 05 14:43:04 crc kubenswrapper[4740]: I0105 14:43:04.612513 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd776c90-77d9-43f1-ac55-0125ff70cf9c-utilities\") pod \"certified-operators-vmlxv\" (UID: \"cd776c90-77d9-43f1-ac55-0125ff70cf9c\") " pod="openshift-marketplace/certified-operators-vmlxv" Jan 05 14:43:04 crc kubenswrapper[4740]: I0105 14:43:04.630652 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dbwdw\" (UniqueName: \"kubernetes.io/projected/cd776c90-77d9-43f1-ac55-0125ff70cf9c-kube-api-access-dbwdw\") pod \"certified-operators-vmlxv\" (UID: \"cd776c90-77d9-43f1-ac55-0125ff70cf9c\") " pod="openshift-marketplace/certified-operators-vmlxv" Jan 05 14:43:04 crc kubenswrapper[4740]: I0105 14:43:04.657631 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vmlxv" Jan 05 14:43:05 crc kubenswrapper[4740]: I0105 14:43:05.215157 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vmlxv"] Jan 05 14:43:05 crc kubenswrapper[4740]: I0105 14:43:05.614150 4740 generic.go:334] "Generic (PLEG): container finished" podID="cd776c90-77d9-43f1-ac55-0125ff70cf9c" containerID="21c026426d933fe925ea4e39eb664b54624a3d5b7515666ae70a295f02bddd36" exitCode=0 Jan 05 14:43:05 crc kubenswrapper[4740]: I0105 14:43:05.614201 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmlxv" event={"ID":"cd776c90-77d9-43f1-ac55-0125ff70cf9c","Type":"ContainerDied","Data":"21c026426d933fe925ea4e39eb664b54624a3d5b7515666ae70a295f02bddd36"} Jan 05 14:43:05 crc kubenswrapper[4740]: I0105 14:43:05.614412 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmlxv" event={"ID":"cd776c90-77d9-43f1-ac55-0125ff70cf9c","Type":"ContainerStarted","Data":"73ed24bdf02ce4893decd749aee1b677540a7576f92c1f10c303e223bcb46d52"} Jan 05 14:43:06 crc kubenswrapper[4740]: I0105 14:43:06.628375 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmlxv" event={"ID":"cd776c90-77d9-43f1-ac55-0125ff70cf9c","Type":"ContainerStarted","Data":"4d4d4e3cfc09b3fc873823b5d04b6001fc7a0712180b80ab27fd17921f4004f8"} Jan 05 14:43:07 crc kubenswrapper[4740]: I0105 14:43:07.646114 4740 generic.go:334] "Generic (PLEG): container finished" podID="cd776c90-77d9-43f1-ac55-0125ff70cf9c" containerID="4d4d4e3cfc09b3fc873823b5d04b6001fc7a0712180b80ab27fd17921f4004f8" exitCode=0 Jan 05 14:43:07 crc kubenswrapper[4740]: I0105 14:43:07.646336 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmlxv" event={"ID":"cd776c90-77d9-43f1-ac55-0125ff70cf9c","Type":"ContainerDied","Data":"4d4d4e3cfc09b3fc873823b5d04b6001fc7a0712180b80ab27fd17921f4004f8"} Jan 05 14:43:08 crc kubenswrapper[4740]: I0105 14:43:08.665886 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmlxv" event={"ID":"cd776c90-77d9-43f1-ac55-0125ff70cf9c","Type":"ContainerStarted","Data":"e1bab6d1ff099c66caf4f3b8303294974591133e7da3d58f36e61039cf95e27d"} Jan 05 14:43:08 crc kubenswrapper[4740]: I0105 14:43:08.724263 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vmlxv" podStartSLOduration=2.050396924 podStartE2EDuration="4.724226691s" podCreationTimestamp="2026-01-05 14:43:04 +0000 UTC" firstStartedPulling="2026-01-05 14:43:05.617019234 +0000 UTC m=+3234.923927853" lastFinishedPulling="2026-01-05 14:43:08.290849041 +0000 UTC m=+3237.597757620" observedRunningTime="2026-01-05 14:43:08.68104618 +0000 UTC m=+3237.987954749" watchObservedRunningTime="2026-01-05 14:43:08.724226691 +0000 UTC m=+3238.031135330" Jan 05 14:43:14 crc kubenswrapper[4740]: I0105 14:43:14.658826 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-vmlxv" Jan 05 14:43:14 crc kubenswrapper[4740]: I0105 14:43:14.659509 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vmlxv" Jan 05 14:43:14 crc kubenswrapper[4740]: I0105 14:43:14.748435 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vmlxv" Jan 05 14:43:14 crc kubenswrapper[4740]: I0105 14:43:14.839721 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vmlxv" Jan 05 14:43:14 crc kubenswrapper[4740]: I0105 14:43:14.968840 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:43:14 crc kubenswrapper[4740]: E0105 14:43:14.969258 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:43:14 crc kubenswrapper[4740]: I0105 14:43:14.991600 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vmlxv"] Jan 05 14:43:16 crc kubenswrapper[4740]: I0105 14:43:16.768748 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vmlxv" podUID="cd776c90-77d9-43f1-ac55-0125ff70cf9c" containerName="registry-server" containerID="cri-o://e1bab6d1ff099c66caf4f3b8303294974591133e7da3d58f36e61039cf95e27d" gracePeriod=2 Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.420589 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vmlxv" Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.585090 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbwdw\" (UniqueName: \"kubernetes.io/projected/cd776c90-77d9-43f1-ac55-0125ff70cf9c-kube-api-access-dbwdw\") pod \"cd776c90-77d9-43f1-ac55-0125ff70cf9c\" (UID: \"cd776c90-77d9-43f1-ac55-0125ff70cf9c\") " Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.585372 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd776c90-77d9-43f1-ac55-0125ff70cf9c-utilities\") pod \"cd776c90-77d9-43f1-ac55-0125ff70cf9c\" (UID: \"cd776c90-77d9-43f1-ac55-0125ff70cf9c\") " Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.585508 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd776c90-77d9-43f1-ac55-0125ff70cf9c-catalog-content\") pod \"cd776c90-77d9-43f1-ac55-0125ff70cf9c\" (UID: \"cd776c90-77d9-43f1-ac55-0125ff70cf9c\") " Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.586706 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd776c90-77d9-43f1-ac55-0125ff70cf9c-utilities" (OuterVolumeSpecName: "utilities") pod "cd776c90-77d9-43f1-ac55-0125ff70cf9c" (UID: "cd776c90-77d9-43f1-ac55-0125ff70cf9c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.590324 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd776c90-77d9-43f1-ac55-0125ff70cf9c-kube-api-access-dbwdw" (OuterVolumeSpecName: "kube-api-access-dbwdw") pod "cd776c90-77d9-43f1-ac55-0125ff70cf9c" (UID: "cd776c90-77d9-43f1-ac55-0125ff70cf9c"). InnerVolumeSpecName "kube-api-access-dbwdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.636732 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd776c90-77d9-43f1-ac55-0125ff70cf9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd776c90-77d9-43f1-ac55-0125ff70cf9c" (UID: "cd776c90-77d9-43f1-ac55-0125ff70cf9c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.689450 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd776c90-77d9-43f1-ac55-0125ff70cf9c-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.689507 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd776c90-77d9-43f1-ac55-0125ff70cf9c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.689528 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbwdw\" (UniqueName: \"kubernetes.io/projected/cd776c90-77d9-43f1-ac55-0125ff70cf9c-kube-api-access-dbwdw\") on node \"crc\" DevicePath \"\"" Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.780664 4740 generic.go:334] "Generic (PLEG): container finished" podID="cd776c90-77d9-43f1-ac55-0125ff70cf9c" containerID="e1bab6d1ff099c66caf4f3b8303294974591133e7da3d58f36e61039cf95e27d" exitCode=0 Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.780726 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmlxv" event={"ID":"cd776c90-77d9-43f1-ac55-0125ff70cf9c","Type":"ContainerDied","Data":"e1bab6d1ff099c66caf4f3b8303294974591133e7da3d58f36e61039cf95e27d"} Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.780776 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmlxv" event={"ID":"cd776c90-77d9-43f1-ac55-0125ff70cf9c","Type":"ContainerDied","Data":"73ed24bdf02ce4893decd749aee1b677540a7576f92c1f10c303e223bcb46d52"} Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.780804 4740 scope.go:117] "RemoveContainer" containerID="e1bab6d1ff099c66caf4f3b8303294974591133e7da3d58f36e61039cf95e27d" Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.780794 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vmlxv" Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.808030 4740 scope.go:117] "RemoveContainer" containerID="4d4d4e3cfc09b3fc873823b5d04b6001fc7a0712180b80ab27fd17921f4004f8" Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.836475 4740 scope.go:117] "RemoveContainer" containerID="21c026426d933fe925ea4e39eb664b54624a3d5b7515666ae70a295f02bddd36" Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.837331 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vmlxv"] Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.850101 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vmlxv"] Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.905882 4740 scope.go:117] "RemoveContainer" containerID="e1bab6d1ff099c66caf4f3b8303294974591133e7da3d58f36e61039cf95e27d" Jan 05 14:43:17 crc kubenswrapper[4740]: E0105 14:43:17.906321 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1bab6d1ff099c66caf4f3b8303294974591133e7da3d58f36e61039cf95e27d\": container with ID starting with e1bab6d1ff099c66caf4f3b8303294974591133e7da3d58f36e61039cf95e27d not found: ID does not exist" containerID="e1bab6d1ff099c66caf4f3b8303294974591133e7da3d58f36e61039cf95e27d" Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.906355 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1bab6d1ff099c66caf4f3b8303294974591133e7da3d58f36e61039cf95e27d"} err="failed to get container status \"e1bab6d1ff099c66caf4f3b8303294974591133e7da3d58f36e61039cf95e27d\": rpc error: code = NotFound desc = could not find container \"e1bab6d1ff099c66caf4f3b8303294974591133e7da3d58f36e61039cf95e27d\": container with ID starting with e1bab6d1ff099c66caf4f3b8303294974591133e7da3d58f36e61039cf95e27d not found: ID does not exist" Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.906373 4740 scope.go:117] "RemoveContainer" containerID="4d4d4e3cfc09b3fc873823b5d04b6001fc7a0712180b80ab27fd17921f4004f8" Jan 05 14:43:17 crc kubenswrapper[4740]: E0105 14:43:17.906667 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d4d4e3cfc09b3fc873823b5d04b6001fc7a0712180b80ab27fd17921f4004f8\": container with ID starting with 4d4d4e3cfc09b3fc873823b5d04b6001fc7a0712180b80ab27fd17921f4004f8 not found: ID does not exist" containerID="4d4d4e3cfc09b3fc873823b5d04b6001fc7a0712180b80ab27fd17921f4004f8" Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.906710 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d4d4e3cfc09b3fc873823b5d04b6001fc7a0712180b80ab27fd17921f4004f8"} err="failed to get container status \"4d4d4e3cfc09b3fc873823b5d04b6001fc7a0712180b80ab27fd17921f4004f8\": rpc error: code = NotFound desc = could not find container \"4d4d4e3cfc09b3fc873823b5d04b6001fc7a0712180b80ab27fd17921f4004f8\": container with ID starting with 4d4d4e3cfc09b3fc873823b5d04b6001fc7a0712180b80ab27fd17921f4004f8 not found: ID does not exist" Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.906738 4740 scope.go:117] "RemoveContainer" containerID="21c026426d933fe925ea4e39eb664b54624a3d5b7515666ae70a295f02bddd36" Jan 05 14:43:17 crc kubenswrapper[4740]: E0105 14:43:17.907029 4740 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"21c026426d933fe925ea4e39eb664b54624a3d5b7515666ae70a295f02bddd36\": container with ID starting with 21c026426d933fe925ea4e39eb664b54624a3d5b7515666ae70a295f02bddd36 not found: ID does not exist" containerID="21c026426d933fe925ea4e39eb664b54624a3d5b7515666ae70a295f02bddd36" Jan 05 14:43:17 crc kubenswrapper[4740]: I0105 14:43:17.907075 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c026426d933fe925ea4e39eb664b54624a3d5b7515666ae70a295f02bddd36"} err="failed to get container status \"21c026426d933fe925ea4e39eb664b54624a3d5b7515666ae70a295f02bddd36\": rpc error: code = NotFound desc = could not find container \"21c026426d933fe925ea4e39eb664b54624a3d5b7515666ae70a295f02bddd36\": container with ID starting with 21c026426d933fe925ea4e39eb664b54624a3d5b7515666ae70a295f02bddd36 not found: ID does not exist" Jan 05 14:43:18 crc kubenswrapper[4740]: I0105 14:43:18.983826 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd776c90-77d9-43f1-ac55-0125ff70cf9c" path="/var/lib/kubelet/pods/cd776c90-77d9-43f1-ac55-0125ff70cf9c/volumes" Jan 05 14:43:26 crc kubenswrapper[4740]: I0105 14:43:26.969285 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:43:26 crc kubenswrapper[4740]: E0105 14:43:26.970094 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:43:39 crc kubenswrapper[4740]: I0105 14:43:39.969274 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:43:39 crc kubenswrapper[4740]: E0105 14:43:39.970556 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:43:50 crc kubenswrapper[4740]: I0105 14:43:50.984669 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:43:50 crc kubenswrapper[4740]: E0105 14:43:50.985939 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:44:03 crc kubenswrapper[4740]: I0105 14:44:03.969250 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:44:03 crc kubenswrapper[4740]: E0105 14:44:03.972135 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:44:06 crc kubenswrapper[4740]: E0105 14:44:06.709213 4740 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.97:40980->38.102.83.97:45101: read tcp 38.102.83.97:40980->38.102.83.97:45101: read: connection reset by peer Jan 05 14:44:18 crc kubenswrapper[4740]: I0105 14:44:18.969100 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:44:18 crc kubenswrapper[4740]: E0105 14:44:18.970752 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:44:31 crc kubenswrapper[4740]: I0105 14:44:31.969384 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:44:31 crc kubenswrapper[4740]: E0105 14:44:31.970335 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:44:44 crc kubenswrapper[4740]: I0105 14:44:44.969282 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:44:44 crc kubenswrapper[4740]: E0105 14:44:44.970205 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:44:55 crc kubenswrapper[4740]: I0105 14:44:55.968945 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:44:55 crc kubenswrapper[4740]: E0105 14:44:55.969972 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.159242 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf"] Jan 05 14:45:00 crc kubenswrapper[4740]: E0105 14:45:00.160209 4740 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd776c90-77d9-43f1-ac55-0125ff70cf9c" containerName="extract-utilities" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.160227 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd776c90-77d9-43f1-ac55-0125ff70cf9c" containerName="extract-utilities" Jan 05 14:45:00 crc kubenswrapper[4740]: E0105 14:45:00.160273 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd776c90-77d9-43f1-ac55-0125ff70cf9c" containerName="registry-server" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.160281 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd776c90-77d9-43f1-ac55-0125ff70cf9c" containerName="registry-server" Jan 05 14:45:00 crc kubenswrapper[4740]: E0105 14:45:00.160307 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd776c90-77d9-43f1-ac55-0125ff70cf9c" containerName="extract-content" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.160312 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd776c90-77d9-43f1-ac55-0125ff70cf9c" containerName="extract-content" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.160522 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd776c90-77d9-43f1-ac55-0125ff70cf9c" containerName="registry-server" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.161369 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.165376 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.165851 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.178326 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf"] Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.201895 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64nhr\" (UniqueName: \"kubernetes.io/projected/39795bb0-ba19-4550-9b49-541fa6e61e24-kube-api-access-64nhr\") pod \"collect-profiles-29460405-qxplf\" (UID: \"39795bb0-ba19-4550-9b49-541fa6e61e24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.202335 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39795bb0-ba19-4550-9b49-541fa6e61e24-secret-volume\") pod \"collect-profiles-29460405-qxplf\" (UID: \"39795bb0-ba19-4550-9b49-541fa6e61e24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.202570 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39795bb0-ba19-4550-9b49-541fa6e61e24-config-volume\") pod \"collect-profiles-29460405-qxplf\" (UID: \"39795bb0-ba19-4550-9b49-541fa6e61e24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.305482 4740 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39795bb0-ba19-4550-9b49-541fa6e61e24-config-volume\") pod \"collect-profiles-29460405-qxplf\" (UID: \"39795bb0-ba19-4550-9b49-541fa6e61e24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.305671 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64nhr\" (UniqueName: \"kubernetes.io/projected/39795bb0-ba19-4550-9b49-541fa6e61e24-kube-api-access-64nhr\") pod \"collect-profiles-29460405-qxplf\" (UID: \"39795bb0-ba19-4550-9b49-541fa6e61e24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.306157 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39795bb0-ba19-4550-9b49-541fa6e61e24-secret-volume\") pod \"collect-profiles-29460405-qxplf\" (UID: \"39795bb0-ba19-4550-9b49-541fa6e61e24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.306921 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39795bb0-ba19-4550-9b49-541fa6e61e24-config-volume\") pod \"collect-profiles-29460405-qxplf\" (UID: \"39795bb0-ba19-4550-9b49-541fa6e61e24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.319500 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39795bb0-ba19-4550-9b49-541fa6e61e24-secret-volume\") pod \"collect-profiles-29460405-qxplf\" (UID: \"39795bb0-ba19-4550-9b49-541fa6e61e24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.325531 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64nhr\" (UniqueName: \"kubernetes.io/projected/39795bb0-ba19-4550-9b49-541fa6e61e24-kube-api-access-64nhr\") pod \"collect-profiles-29460405-qxplf\" (UID: \"39795bb0-ba19-4550-9b49-541fa6e61e24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" Jan 05 14:45:00 crc kubenswrapper[4740]: I0105 14:45:00.504727 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" Jan 05 14:45:01 crc kubenswrapper[4740]: I0105 14:45:01.029664 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf"] Jan 05 14:45:01 crc kubenswrapper[4740]: I0105 14:45:01.333350 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" event={"ID":"39795bb0-ba19-4550-9b49-541fa6e61e24","Type":"ContainerStarted","Data":"bc174763d8a529e1249b358115527b28da018c9284b6a1c1e6910deb7984cf56"} Jan 05 14:45:01 crc kubenswrapper[4740]: I0105 14:45:01.333570 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" event={"ID":"39795bb0-ba19-4550-9b49-541fa6e61e24","Type":"ContainerStarted","Data":"ee4acab8c4ea700e567e950ebf4148155af35dbb9e6ec7da8237406fd414c327"} Jan 05 14:45:01 crc kubenswrapper[4740]: I0105 14:45:01.350787 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" podStartSLOduration=1.350772274 podStartE2EDuration="1.350772274s" podCreationTimestamp="2026-01-05 14:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 14:45:01.347983698 +0000 UTC m=+3350.654892277" watchObservedRunningTime="2026-01-05 14:45:01.350772274 +0000 UTC m=+3350.657680853" Jan 05 14:45:02 crc kubenswrapper[4740]: I0105 14:45:02.350616 4740 generic.go:334] "Generic (PLEG): container finished" podID="39795bb0-ba19-4550-9b49-541fa6e61e24" containerID="bc174763d8a529e1249b358115527b28da018c9284b6a1c1e6910deb7984cf56" exitCode=0 Jan 05 14:45:02 crc kubenswrapper[4740]: I0105 14:45:02.350723 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" event={"ID":"39795bb0-ba19-4550-9b49-541fa6e61e24","Type":"ContainerDied","Data":"bc174763d8a529e1249b358115527b28da018c9284b6a1c1e6910deb7984cf56"} Jan 05 14:45:03 crc kubenswrapper[4740]: I0105 14:45:03.804444 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" Jan 05 14:45:03 crc kubenswrapper[4740]: I0105 14:45:03.896362 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39795bb0-ba19-4550-9b49-541fa6e61e24-config-volume\") pod \"39795bb0-ba19-4550-9b49-541fa6e61e24\" (UID: \"39795bb0-ba19-4550-9b49-541fa6e61e24\") " Jan 05 14:45:03 crc kubenswrapper[4740]: I0105 14:45:03.896516 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64nhr\" (UniqueName: \"kubernetes.io/projected/39795bb0-ba19-4550-9b49-541fa6e61e24-kube-api-access-64nhr\") pod \"39795bb0-ba19-4550-9b49-541fa6e61e24\" (UID: \"39795bb0-ba19-4550-9b49-541fa6e61e24\") " Jan 05 14:45:03 crc kubenswrapper[4740]: I0105 14:45:03.896851 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39795bb0-ba19-4550-9b49-541fa6e61e24-secret-volume\") pod \"39795bb0-ba19-4550-9b49-541fa6e61e24\" (UID: \"39795bb0-ba19-4550-9b49-541fa6e61e24\") " Jan 05 14:45:03 crc kubenswrapper[4740]: I0105 14:45:03.897321 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39795bb0-ba19-4550-9b49-541fa6e61e24-config-volume" (OuterVolumeSpecName: "config-volume") pod "39795bb0-ba19-4550-9b49-541fa6e61e24" (UID: "39795bb0-ba19-4550-9b49-541fa6e61e24"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 14:45:03 crc kubenswrapper[4740]: I0105 14:45:03.898822 4740 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39795bb0-ba19-4550-9b49-541fa6e61e24-config-volume\") on node \"crc\" DevicePath \"\"" Jan 05 14:45:03 crc kubenswrapper[4740]: I0105 14:45:03.904001 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39795bb0-ba19-4550-9b49-541fa6e61e24-kube-api-access-64nhr" (OuterVolumeSpecName: "kube-api-access-64nhr") pod "39795bb0-ba19-4550-9b49-541fa6e61e24" (UID: "39795bb0-ba19-4550-9b49-541fa6e61e24"). InnerVolumeSpecName "kube-api-access-64nhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:45:03 crc kubenswrapper[4740]: I0105 14:45:03.909315 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39795bb0-ba19-4550-9b49-541fa6e61e24-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "39795bb0-ba19-4550-9b49-541fa6e61e24" (UID: "39795bb0-ba19-4550-9b49-541fa6e61e24"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 14:45:04 crc kubenswrapper[4740]: I0105 14:45:04.004691 4740 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39795bb0-ba19-4550-9b49-541fa6e61e24-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 05 14:45:04 crc kubenswrapper[4740]: I0105 14:45:04.004740 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64nhr\" (UniqueName: \"kubernetes.io/projected/39795bb0-ba19-4550-9b49-541fa6e61e24-kube-api-access-64nhr\") on node \"crc\" DevicePath \"\"" Jan 05 14:45:04 crc kubenswrapper[4740]: I0105 14:45:04.374973 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" event={"ID":"39795bb0-ba19-4550-9b49-541fa6e61e24","Type":"ContainerDied","Data":"ee4acab8c4ea700e567e950ebf4148155af35dbb9e6ec7da8237406fd414c327"} Jan 05 14:45:04 crc kubenswrapper[4740]: I0105 14:45:04.375011 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee4acab8c4ea700e567e950ebf4148155af35dbb9e6ec7da8237406fd414c327" Jan 05 14:45:04 crc kubenswrapper[4740]: I0105 14:45:04.375084 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460405-qxplf" Jan 05 14:45:04 crc kubenswrapper[4740]: I0105 14:45:04.444322 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247"] Jan 05 14:45:04 crc kubenswrapper[4740]: I0105 14:45:04.458122 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460360-mr247"] Jan 05 14:45:04 crc kubenswrapper[4740]: I0105 14:45:04.983558 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="187b3735-b06f-4a45-884d-f8ea41af0173" path="/var/lib/kubelet/pods/187b3735-b06f-4a45-884d-f8ea41af0173/volumes" Jan 05 14:45:07 crc kubenswrapper[4740]: I0105 14:45:07.969656 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:45:07 crc kubenswrapper[4740]: E0105 14:45:07.970199 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:45:21 crc kubenswrapper[4740]: I0105 14:45:21.968212 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:45:21 crc kubenswrapper[4740]: E0105 14:45:21.969642 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:45:33 crc kubenswrapper[4740]: I0105 14:45:33.735954 4740 scope.go:117] "RemoveContainer" containerID="6320bd56c1666ea646aa87299c494e9c174c68db6e715868333a449cce98533b" Jan 05 14:45:35 
crc kubenswrapper[4740]: I0105 14:45:35.968287 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:45:35 crc kubenswrapper[4740]: E0105 14:45:35.969131 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:45:51 crc kubenswrapper[4740]: I0105 14:45:51.001001 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:45:51 crc kubenswrapper[4740]: E0105 14:45:51.002165 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:45:55 crc kubenswrapper[4740]: I0105 14:45:55.900311 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tkq8k"] Jan 05 14:45:55 crc kubenswrapper[4740]: E0105 14:45:55.901470 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39795bb0-ba19-4550-9b49-541fa6e61e24" containerName="collect-profiles" Jan 05 14:45:55 crc kubenswrapper[4740]: I0105 14:45:55.901486 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="39795bb0-ba19-4550-9b49-541fa6e61e24" containerName="collect-profiles" Jan 05 14:45:55 crc kubenswrapper[4740]: I0105 14:45:55.901800 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="39795bb0-ba19-4550-9b49-541fa6e61e24" containerName="collect-profiles" Jan 05 14:45:55 crc kubenswrapper[4740]: I0105 14:45:55.904274 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tkq8k" Jan 05 14:45:55 crc kubenswrapper[4740]: I0105 14:45:55.946420 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tkq8k"] Jan 05 14:45:56 crc kubenswrapper[4740]: I0105 14:45:56.068524 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzq2v\" (UniqueName: \"kubernetes.io/projected/e5951917-5cd1-4926-8507-164a48531499-kube-api-access-pzq2v\") pod \"community-operators-tkq8k\" (UID: \"e5951917-5cd1-4926-8507-164a48531499\") " pod="openshift-marketplace/community-operators-tkq8k" Jan 05 14:45:56 crc kubenswrapper[4740]: I0105 14:45:56.068868 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5951917-5cd1-4926-8507-164a48531499-utilities\") pod \"community-operators-tkq8k\" (UID: \"e5951917-5cd1-4926-8507-164a48531499\") " pod="openshift-marketplace/community-operators-tkq8k" Jan 05 14:45:56 crc kubenswrapper[4740]: I0105 14:45:56.068984 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5951917-5cd1-4926-8507-164a48531499-catalog-content\") pod \"community-operators-tkq8k\" (UID: \"e5951917-5cd1-4926-8507-164a48531499\") " pod="openshift-marketplace/community-operators-tkq8k" Jan 05 14:45:56 crc kubenswrapper[4740]: I0105 14:45:56.171508 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzq2v\" (UniqueName: \"kubernetes.io/projected/e5951917-5cd1-4926-8507-164a48531499-kube-api-access-pzq2v\") pod \"community-operators-tkq8k\" (UID: \"e5951917-5cd1-4926-8507-164a48531499\") " pod="openshift-marketplace/community-operators-tkq8k" Jan 05 14:45:56 crc kubenswrapper[4740]: I0105 14:45:56.172022 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5951917-5cd1-4926-8507-164a48531499-utilities\") pod \"community-operators-tkq8k\" (UID: \"e5951917-5cd1-4926-8507-164a48531499\") " pod="openshift-marketplace/community-operators-tkq8k" Jan 05 14:45:56 crc kubenswrapper[4740]: I0105 14:45:56.172225 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5951917-5cd1-4926-8507-164a48531499-catalog-content\") pod \"community-operators-tkq8k\" (UID: \"e5951917-5cd1-4926-8507-164a48531499\") " pod="openshift-marketplace/community-operators-tkq8k" Jan 05 14:45:56 crc kubenswrapper[4740]: I0105 14:45:56.172540 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5951917-5cd1-4926-8507-164a48531499-utilities\") pod \"community-operators-tkq8k\" (UID: \"e5951917-5cd1-4926-8507-164a48531499\") " pod="openshift-marketplace/community-operators-tkq8k" Jan 05 14:45:56 crc kubenswrapper[4740]: I0105 14:45:56.172594 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5951917-5cd1-4926-8507-164a48531499-catalog-content\") pod \"community-operators-tkq8k\" (UID: \"e5951917-5cd1-4926-8507-164a48531499\") " pod="openshift-marketplace/community-operators-tkq8k" Jan 05 14:45:56 crc kubenswrapper[4740]: I0105 14:45:56.195211 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pzq2v\" (UniqueName: \"kubernetes.io/projected/e5951917-5cd1-4926-8507-164a48531499-kube-api-access-pzq2v\") pod \"community-operators-tkq8k\" (UID: \"e5951917-5cd1-4926-8507-164a48531499\") " pod="openshift-marketplace/community-operators-tkq8k" Jan 05 14:45:56 crc kubenswrapper[4740]: I0105 14:45:56.240545 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tkq8k" Jan 05 14:45:56 crc kubenswrapper[4740]: I0105 14:45:56.793341 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tkq8k"] Jan 05 14:45:57 crc kubenswrapper[4740]: I0105 14:45:57.300217 4740 generic.go:334] "Generic (PLEG): container finished" podID="e5951917-5cd1-4926-8507-164a48531499" containerID="d7a711a10812bc02607257080c8944a522998e4db06b9284c3dbc387cff64e77" exitCode=0 Jan 05 14:45:57 crc kubenswrapper[4740]: I0105 14:45:57.300293 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkq8k" event={"ID":"e5951917-5cd1-4926-8507-164a48531499","Type":"ContainerDied","Data":"d7a711a10812bc02607257080c8944a522998e4db06b9284c3dbc387cff64e77"} Jan 05 14:45:57 crc kubenswrapper[4740]: I0105 14:45:57.300552 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkq8k" event={"ID":"e5951917-5cd1-4926-8507-164a48531499","Type":"ContainerStarted","Data":"f428b892456e7c0f5fc5c1be3ab3ed00cbc0c2693ca0dc2f39a0491801faef71"} Jan 05 14:45:57 crc kubenswrapper[4740]: I0105 14:45:57.305801 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 05 14:45:58 crc kubenswrapper[4740]: I0105 14:45:58.316786 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkq8k" event={"ID":"e5951917-5cd1-4926-8507-164a48531499","Type":"ContainerStarted","Data":"4667c599db2c749b140f2bdd946209b0c287c5cb8e5a5176a8b1bc8254385359"} Jan 05 14:46:00 crc kubenswrapper[4740]: I0105 14:46:00.356605 4740 generic.go:334] "Generic (PLEG): container finished" podID="e5951917-5cd1-4926-8507-164a48531499" containerID="4667c599db2c749b140f2bdd946209b0c287c5cb8e5a5176a8b1bc8254385359" exitCode=0 Jan 05 14:46:00 crc kubenswrapper[4740]: I0105 14:46:00.356703 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkq8k" event={"ID":"e5951917-5cd1-4926-8507-164a48531499","Type":"ContainerDied","Data":"4667c599db2c749b140f2bdd946209b0c287c5cb8e5a5176a8b1bc8254385359"} Jan 05 14:46:02 crc kubenswrapper[4740]: I0105 14:46:02.385528 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkq8k" event={"ID":"e5951917-5cd1-4926-8507-164a48531499","Type":"ContainerStarted","Data":"1731a3e9c65c583a6c71187955837316fa3ffef3500888da460d38cb41735527"} Jan 05 14:46:02 crc kubenswrapper[4740]: I0105 14:46:02.413902 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tkq8k" podStartSLOduration=3.467478374 podStartE2EDuration="7.413882446s" podCreationTimestamp="2026-01-05 14:45:55 +0000 UTC" firstStartedPulling="2026-01-05 14:45:57.305407296 +0000 UTC m=+3406.612315915" lastFinishedPulling="2026-01-05 14:46:01.251811408 +0000 UTC m=+3410.558719987" observedRunningTime="2026-01-05 14:46:02.407715508 +0000 UTC m=+3411.714624097" watchObservedRunningTime="2026-01-05 
14:46:02.413882446 +0000 UTC m=+3411.720791035" Jan 05 14:46:04 crc kubenswrapper[4740]: I0105 14:46:04.969673 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:46:04 crc kubenswrapper[4740]: E0105 14:46:04.971253 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:46:06 crc kubenswrapper[4740]: I0105 14:46:06.242006 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tkq8k" Jan 05 14:46:06 crc kubenswrapper[4740]: I0105 14:46:06.242357 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tkq8k" Jan 05 14:46:06 crc kubenswrapper[4740]: I0105 14:46:06.318730 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tkq8k" Jan 05 14:46:06 crc kubenswrapper[4740]: I0105 14:46:06.484737 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tkq8k" Jan 05 14:46:06 crc kubenswrapper[4740]: I0105 14:46:06.575885 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tkq8k"] Jan 05 14:46:08 crc kubenswrapper[4740]: I0105 14:46:08.464942 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tkq8k" podUID="e5951917-5cd1-4926-8507-164a48531499" containerName="registry-server" containerID="cri-o://1731a3e9c65c583a6c71187955837316fa3ffef3500888da460d38cb41735527" gracePeriod=2 Jan 05 14:46:08 crc kubenswrapper[4740]: I0105 14:46:08.997632 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tkq8k" Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.176401 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzq2v\" (UniqueName: \"kubernetes.io/projected/e5951917-5cd1-4926-8507-164a48531499-kube-api-access-pzq2v\") pod \"e5951917-5cd1-4926-8507-164a48531499\" (UID: \"e5951917-5cd1-4926-8507-164a48531499\") " Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.176729 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5951917-5cd1-4926-8507-164a48531499-utilities\") pod \"e5951917-5cd1-4926-8507-164a48531499\" (UID: \"e5951917-5cd1-4926-8507-164a48531499\") " Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.176945 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5951917-5cd1-4926-8507-164a48531499-catalog-content\") pod \"e5951917-5cd1-4926-8507-164a48531499\" (UID: \"e5951917-5cd1-4926-8507-164a48531499\") " Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.177415 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5951917-5cd1-4926-8507-164a48531499-utilities" (OuterVolumeSpecName: "utilities") pod "e5951917-5cd1-4926-8507-164a48531499" (UID: "e5951917-5cd1-4926-8507-164a48531499"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.179178 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5951917-5cd1-4926-8507-164a48531499-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.183960 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5951917-5cd1-4926-8507-164a48531499-kube-api-access-pzq2v" (OuterVolumeSpecName: "kube-api-access-pzq2v") pod "e5951917-5cd1-4926-8507-164a48531499" (UID: "e5951917-5cd1-4926-8507-164a48531499"). InnerVolumeSpecName "kube-api-access-pzq2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.261193 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5951917-5cd1-4926-8507-164a48531499-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5951917-5cd1-4926-8507-164a48531499" (UID: "e5951917-5cd1-4926-8507-164a48531499"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.281036 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5951917-5cd1-4926-8507-164a48531499-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.281083 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzq2v\" (UniqueName: \"kubernetes.io/projected/e5951917-5cd1-4926-8507-164a48531499-kube-api-access-pzq2v\") on node \"crc\" DevicePath \"\"" Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.486915 4740 generic.go:334] "Generic (PLEG): container finished" podID="e5951917-5cd1-4926-8507-164a48531499" containerID="1731a3e9c65c583a6c71187955837316fa3ffef3500888da460d38cb41735527" exitCode=0 Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.486984 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tkq8k" Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.486966 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkq8k" event={"ID":"e5951917-5cd1-4926-8507-164a48531499","Type":"ContainerDied","Data":"1731a3e9c65c583a6c71187955837316fa3ffef3500888da460d38cb41735527"} Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.487106 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkq8k" event={"ID":"e5951917-5cd1-4926-8507-164a48531499","Type":"ContainerDied","Data":"f428b892456e7c0f5fc5c1be3ab3ed00cbc0c2693ca0dc2f39a0491801faef71"} Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.487134 4740 scope.go:117] "RemoveContainer" containerID="1731a3e9c65c583a6c71187955837316fa3ffef3500888da460d38cb41735527" Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.525110 4740 scope.go:117] "RemoveContainer" containerID="4667c599db2c749b140f2bdd946209b0c287c5cb8e5a5176a8b1bc8254385359" Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.535214 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tkq8k"] Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.546759 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tkq8k"] Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.558355 4740 scope.go:117] "RemoveContainer" containerID="d7a711a10812bc02607257080c8944a522998e4db06b9284c3dbc387cff64e77" Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.620233 4740 scope.go:117] "RemoveContainer" containerID="1731a3e9c65c583a6c71187955837316fa3ffef3500888da460d38cb41735527" Jan 05 14:46:09 crc kubenswrapper[4740]: E0105 14:46:09.620628 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1731a3e9c65c583a6c71187955837316fa3ffef3500888da460d38cb41735527\": container with ID starting with 1731a3e9c65c583a6c71187955837316fa3ffef3500888da460d38cb41735527 not found: ID does not exist" containerID="1731a3e9c65c583a6c71187955837316fa3ffef3500888da460d38cb41735527" Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.620662 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1731a3e9c65c583a6c71187955837316fa3ffef3500888da460d38cb41735527"} err="failed to get container status 
\"1731a3e9c65c583a6c71187955837316fa3ffef3500888da460d38cb41735527\": rpc error: code = NotFound desc = could not find container \"1731a3e9c65c583a6c71187955837316fa3ffef3500888da460d38cb41735527\": container with ID starting with 1731a3e9c65c583a6c71187955837316fa3ffef3500888da460d38cb41735527 not found: ID does not exist" Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.620686 4740 scope.go:117] "RemoveContainer" containerID="4667c599db2c749b140f2bdd946209b0c287c5cb8e5a5176a8b1bc8254385359" Jan 05 14:46:09 crc kubenswrapper[4740]: E0105 14:46:09.621097 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4667c599db2c749b140f2bdd946209b0c287c5cb8e5a5176a8b1bc8254385359\": container with ID starting with 4667c599db2c749b140f2bdd946209b0c287c5cb8e5a5176a8b1bc8254385359 not found: ID does not exist" containerID="4667c599db2c749b140f2bdd946209b0c287c5cb8e5a5176a8b1bc8254385359" Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.621210 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4667c599db2c749b140f2bdd946209b0c287c5cb8e5a5176a8b1bc8254385359"} err="failed to get container status \"4667c599db2c749b140f2bdd946209b0c287c5cb8e5a5176a8b1bc8254385359\": rpc error: code = NotFound desc = could not find container \"4667c599db2c749b140f2bdd946209b0c287c5cb8e5a5176a8b1bc8254385359\": container with ID starting with 4667c599db2c749b140f2bdd946209b0c287c5cb8e5a5176a8b1bc8254385359 not found: ID does not exist" Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.621327 4740 scope.go:117] "RemoveContainer" containerID="d7a711a10812bc02607257080c8944a522998e4db06b9284c3dbc387cff64e77" Jan 05 14:46:09 crc kubenswrapper[4740]: E0105 14:46:09.621835 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7a711a10812bc02607257080c8944a522998e4db06b9284c3dbc387cff64e77\": container with ID starting with d7a711a10812bc02607257080c8944a522998e4db06b9284c3dbc387cff64e77 not found: ID does not exist" containerID="d7a711a10812bc02607257080c8944a522998e4db06b9284c3dbc387cff64e77" Jan 05 14:46:09 crc kubenswrapper[4740]: I0105 14:46:09.621867 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7a711a10812bc02607257080c8944a522998e4db06b9284c3dbc387cff64e77"} err="failed to get container status \"d7a711a10812bc02607257080c8944a522998e4db06b9284c3dbc387cff64e77\": rpc error: code = NotFound desc = could not find container \"d7a711a10812bc02607257080c8944a522998e4db06b9284c3dbc387cff64e77\": container with ID starting with d7a711a10812bc02607257080c8944a522998e4db06b9284c3dbc387cff64e77 not found: ID does not exist" Jan 05 14:46:10 crc kubenswrapper[4740]: I0105 14:46:10.992789 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5951917-5cd1-4926-8507-164a48531499" path="/var/lib/kubelet/pods/e5951917-5cd1-4926-8507-164a48531499/volumes" Jan 05 14:46:15 crc kubenswrapper[4740]: I0105 14:46:15.968743 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:46:15 crc kubenswrapper[4740]: E0105 14:46:15.969813 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:46:22 crc kubenswrapper[4740]: I0105 14:46:22.934573 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8q5ns"] Jan 05 14:46:22 crc kubenswrapper[4740]: E0105 14:46:22.935644 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5951917-5cd1-4926-8507-164a48531499" containerName="extract-content" Jan 05 14:46:22 crc kubenswrapper[4740]: I0105 14:46:22.935657 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5951917-5cd1-4926-8507-164a48531499" containerName="extract-content" Jan 05 14:46:22 crc kubenswrapper[4740]: E0105 14:46:22.935669 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5951917-5cd1-4926-8507-164a48531499" containerName="extract-utilities" Jan 05 14:46:22 crc kubenswrapper[4740]: I0105 14:46:22.935676 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5951917-5cd1-4926-8507-164a48531499" containerName="extract-utilities" Jan 05 14:46:22 crc kubenswrapper[4740]: E0105 14:46:22.935697 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5951917-5cd1-4926-8507-164a48531499" containerName="registry-server" Jan 05 14:46:22 crc kubenswrapper[4740]: I0105 14:46:22.935704 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5951917-5cd1-4926-8507-164a48531499" containerName="registry-server" Jan 05 14:46:22 crc kubenswrapper[4740]: I0105 14:46:22.935947 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5951917-5cd1-4926-8507-164a48531499" containerName="registry-server" Jan 05 14:46:22 crc kubenswrapper[4740]: I0105 14:46:22.937714 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8q5ns" Jan 05 14:46:22 crc kubenswrapper[4740]: I0105 14:46:22.949723 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8q5ns"] Jan 05 14:46:23 crc kubenswrapper[4740]: I0105 14:46:23.083929 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e38ea3c-6147-4049-a885-c9a247a5697c-catalog-content\") pod \"redhat-marketplace-8q5ns\" (UID: \"4e38ea3c-6147-4049-a885-c9a247a5697c\") " pod="openshift-marketplace/redhat-marketplace-8q5ns" Jan 05 14:46:23 crc kubenswrapper[4740]: I0105 14:46:23.084256 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e38ea3c-6147-4049-a885-c9a247a5697c-utilities\") pod \"redhat-marketplace-8q5ns\" (UID: \"4e38ea3c-6147-4049-a885-c9a247a5697c\") " pod="openshift-marketplace/redhat-marketplace-8q5ns" Jan 05 14:46:23 crc kubenswrapper[4740]: I0105 14:46:23.085023 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx7pp\" (UniqueName: \"kubernetes.io/projected/4e38ea3c-6147-4049-a885-c9a247a5697c-kube-api-access-gx7pp\") pod \"redhat-marketplace-8q5ns\" (UID: \"4e38ea3c-6147-4049-a885-c9a247a5697c\") " pod="openshift-marketplace/redhat-marketplace-8q5ns" Jan 05 14:46:23 crc kubenswrapper[4740]: I0105 14:46:23.188522 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx7pp\" (UniqueName: \"kubernetes.io/projected/4e38ea3c-6147-4049-a885-c9a247a5697c-kube-api-access-gx7pp\") pod \"redhat-marketplace-8q5ns\" (UID: \"4e38ea3c-6147-4049-a885-c9a247a5697c\") " pod="openshift-marketplace/redhat-marketplace-8q5ns" Jan 05 14:46:23 crc kubenswrapper[4740]: I0105 14:46:23.188735 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e38ea3c-6147-4049-a885-c9a247a5697c-catalog-content\") pod \"redhat-marketplace-8q5ns\" (UID: \"4e38ea3c-6147-4049-a885-c9a247a5697c\") " pod="openshift-marketplace/redhat-marketplace-8q5ns" Jan 05 14:46:23 crc kubenswrapper[4740]: I0105 14:46:23.188884 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e38ea3c-6147-4049-a885-c9a247a5697c-utilities\") pod \"redhat-marketplace-8q5ns\" (UID: \"4e38ea3c-6147-4049-a885-c9a247a5697c\") " pod="openshift-marketplace/redhat-marketplace-8q5ns" Jan 05 14:46:23 crc kubenswrapper[4740]: I0105 14:46:23.189514 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e38ea3c-6147-4049-a885-c9a247a5697c-catalog-content\") pod \"redhat-marketplace-8q5ns\" (UID: \"4e38ea3c-6147-4049-a885-c9a247a5697c\") " pod="openshift-marketplace/redhat-marketplace-8q5ns" Jan 05 14:46:23 crc kubenswrapper[4740]: I0105 14:46:23.189536 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e38ea3c-6147-4049-a885-c9a247a5697c-utilities\") pod \"redhat-marketplace-8q5ns\" (UID: \"4e38ea3c-6147-4049-a885-c9a247a5697c\") " pod="openshift-marketplace/redhat-marketplace-8q5ns" Jan 05 14:46:23 crc kubenswrapper[4740]: I0105 14:46:23.210948 4740 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-gx7pp\" (UniqueName: \"kubernetes.io/projected/4e38ea3c-6147-4049-a885-c9a247a5697c-kube-api-access-gx7pp\") pod \"redhat-marketplace-8q5ns\" (UID: \"4e38ea3c-6147-4049-a885-c9a247a5697c\") " pod="openshift-marketplace/redhat-marketplace-8q5ns" Jan 05 14:46:23 crc kubenswrapper[4740]: I0105 14:46:23.282800 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8q5ns" Jan 05 14:46:23 crc kubenswrapper[4740]: I0105 14:46:23.873003 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8q5ns"] Jan 05 14:46:24 crc kubenswrapper[4740]: I0105 14:46:24.702490 4740 generic.go:334] "Generic (PLEG): container finished" podID="4e38ea3c-6147-4049-a885-c9a247a5697c" containerID="0bc0e79d8eaf868d73ab61ffda63afa88aa9e87417506a44c44bb8b846028210" exitCode=0 Jan 05 14:46:24 crc kubenswrapper[4740]: I0105 14:46:24.702573 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8q5ns" event={"ID":"4e38ea3c-6147-4049-a885-c9a247a5697c","Type":"ContainerDied","Data":"0bc0e79d8eaf868d73ab61ffda63afa88aa9e87417506a44c44bb8b846028210"} Jan 05 14:46:24 crc kubenswrapper[4740]: I0105 14:46:24.702785 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8q5ns" event={"ID":"4e38ea3c-6147-4049-a885-c9a247a5697c","Type":"ContainerStarted","Data":"6d9486eff77e02c6562736ba1811375fd18eef0f0e8e0738574a9332764fb87d"} Jan 05 14:46:28 crc kubenswrapper[4740]: I0105 14:46:28.762444 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8q5ns" event={"ID":"4e38ea3c-6147-4049-a885-c9a247a5697c","Type":"ContainerStarted","Data":"dfebe62697c2745eb4d9a1933ad65f8e9b106cbcf07048c99b6c476547637fe3"} Jan 05 14:46:28 crc kubenswrapper[4740]: I0105 14:46:28.969491 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:46:28 crc kubenswrapper[4740]: E0105 14:46:28.970110 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:46:29 crc kubenswrapper[4740]: I0105 14:46:29.779191 4740 generic.go:334] "Generic (PLEG): container finished" podID="4e38ea3c-6147-4049-a885-c9a247a5697c" containerID="dfebe62697c2745eb4d9a1933ad65f8e9b106cbcf07048c99b6c476547637fe3" exitCode=0 Jan 05 14:46:29 crc kubenswrapper[4740]: I0105 14:46:29.779275 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8q5ns" event={"ID":"4e38ea3c-6147-4049-a885-c9a247a5697c","Type":"ContainerDied","Data":"dfebe62697c2745eb4d9a1933ad65f8e9b106cbcf07048c99b6c476547637fe3"} Jan 05 14:46:30 crc kubenswrapper[4740]: I0105 14:46:30.806616 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8q5ns" event={"ID":"4e38ea3c-6147-4049-a885-c9a247a5697c","Type":"ContainerStarted","Data":"4dc639a930fffcf3d1cd67c7c1cfd8933bc449979078a812c2d3c401570969d7"} Jan 05 14:46:30 crc kubenswrapper[4740]: I0105 14:46:30.831786 4740 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/redhat-marketplace-8q5ns" podStartSLOduration=3.215730774 podStartE2EDuration="8.831765697s" podCreationTimestamp="2026-01-05 14:46:22 +0000 UTC" firstStartedPulling="2026-01-05 14:46:24.705714164 +0000 UTC m=+3434.012622773" lastFinishedPulling="2026-01-05 14:46:30.321749107 +0000 UTC m=+3439.628657696" observedRunningTime="2026-01-05 14:46:30.824852672 +0000 UTC m=+3440.131761261" watchObservedRunningTime="2026-01-05 14:46:30.831765697 +0000 UTC m=+3440.138674276" Jan 05 14:46:33 crc kubenswrapper[4740]: I0105 14:46:33.283735 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8q5ns" Jan 05 14:46:33 crc kubenswrapper[4740]: I0105 14:46:33.284854 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8q5ns" Jan 05 14:46:33 crc kubenswrapper[4740]: I0105 14:46:33.381922 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8q5ns" Jan 05 14:46:40 crc kubenswrapper[4740]: I0105 14:46:40.983323 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:46:41 crc kubenswrapper[4740]: I0105 14:46:41.955747 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"f437b95577e0870ce99465071c031d6e21d5a246939c91d667331ca2a4ea584c"} Jan 05 14:46:43 crc kubenswrapper[4740]: I0105 14:46:43.376872 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8q5ns" Jan 05 14:46:43 crc kubenswrapper[4740]: I0105 14:46:43.490348 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8q5ns"] Jan 05 14:46:43 crc kubenswrapper[4740]: I0105 14:46:43.563809 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5zsln"] Jan 05 14:46:43 crc kubenswrapper[4740]: I0105 14:46:43.564592 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5zsln" podUID="68cdb1a1-6b8f-410a-955a-aa1077491ae7" containerName="registry-server" containerID="cri-o://778005c870171e4861881ee77cb9c8a6c66ecae130a74a8c31989f8ea4aa0261" gracePeriod=2 Jan 05 14:46:43 crc kubenswrapper[4740]: I0105 14:46:43.583853 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gk89z"] Jan 05 14:46:43 crc kubenswrapper[4740]: I0105 14:46:43.584220 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gk89z" podUID="47adddb7-6c5d-4b97-8502-2fe887d9b8dc" containerName="registry-server" containerID="cri-o://1d873e358bf056c3c71f696db16854df0fe6e58ac27921eefda8d2ff2d3c675e" gracePeriod=2 Jan 05 14:46:43 crc kubenswrapper[4740]: I0105 14:46:43.984187 4740 generic.go:334] "Generic (PLEG): container finished" podID="68cdb1a1-6b8f-410a-955a-aa1077491ae7" containerID="778005c870171e4861881ee77cb9c8a6c66ecae130a74a8c31989f8ea4aa0261" exitCode=0 Jan 05 14:46:43 crc kubenswrapper[4740]: I0105 14:46:43.984284 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zsln" 
event={"ID":"68cdb1a1-6b8f-410a-955a-aa1077491ae7","Type":"ContainerDied","Data":"778005c870171e4861881ee77cb9c8a6c66ecae130a74a8c31989f8ea4aa0261"} Jan 05 14:46:43 crc kubenswrapper[4740]: I0105 14:46:43.986850 4740 generic.go:334] "Generic (PLEG): container finished" podID="47adddb7-6c5d-4b97-8502-2fe887d9b8dc" containerID="1d873e358bf056c3c71f696db16854df0fe6e58ac27921eefda8d2ff2d3c675e" exitCode=0 Jan 05 14:46:43 crc kubenswrapper[4740]: I0105 14:46:43.986913 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gk89z" event={"ID":"47adddb7-6c5d-4b97-8502-2fe887d9b8dc","Type":"ContainerDied","Data":"1d873e358bf056c3c71f696db16854df0fe6e58ac27921eefda8d2ff2d3c675e"} Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.259940 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5zsln" Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.267740 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gk89z" Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.312686 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4vdh\" (UniqueName: \"kubernetes.io/projected/68cdb1a1-6b8f-410a-955a-aa1077491ae7-kube-api-access-f4vdh\") pod \"68cdb1a1-6b8f-410a-955a-aa1077491ae7\" (UID: \"68cdb1a1-6b8f-410a-955a-aa1077491ae7\") " Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.312848 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68cdb1a1-6b8f-410a-955a-aa1077491ae7-catalog-content\") pod \"68cdb1a1-6b8f-410a-955a-aa1077491ae7\" (UID: \"68cdb1a1-6b8f-410a-955a-aa1077491ae7\") " Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.312902 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pbjv\" (UniqueName: \"kubernetes.io/projected/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-kube-api-access-5pbjv\") pod \"47adddb7-6c5d-4b97-8502-2fe887d9b8dc\" (UID: \"47adddb7-6c5d-4b97-8502-2fe887d9b8dc\") " Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.312933 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-catalog-content\") pod \"47adddb7-6c5d-4b97-8502-2fe887d9b8dc\" (UID: \"47adddb7-6c5d-4b97-8502-2fe887d9b8dc\") " Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.312967 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68cdb1a1-6b8f-410a-955a-aa1077491ae7-utilities\") pod \"68cdb1a1-6b8f-410a-955a-aa1077491ae7\" (UID: \"68cdb1a1-6b8f-410a-955a-aa1077491ae7\") " Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.313211 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-utilities\") pod \"47adddb7-6c5d-4b97-8502-2fe887d9b8dc\" (UID: \"47adddb7-6c5d-4b97-8502-2fe887d9b8dc\") " Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.314149 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-utilities" (OuterVolumeSpecName: "utilities") pod "47adddb7-6c5d-4b97-8502-2fe887d9b8dc" (UID: 
"47adddb7-6c5d-4b97-8502-2fe887d9b8dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.314732 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68cdb1a1-6b8f-410a-955a-aa1077491ae7-utilities" (OuterVolumeSpecName: "utilities") pod "68cdb1a1-6b8f-410a-955a-aa1077491ae7" (UID: "68cdb1a1-6b8f-410a-955a-aa1077491ae7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.319593 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68cdb1a1-6b8f-410a-955a-aa1077491ae7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68cdb1a1-6b8f-410a-955a-aa1077491ae7" (UID: "68cdb1a1-6b8f-410a-955a-aa1077491ae7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.321681 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-kube-api-access-5pbjv" (OuterVolumeSpecName: "kube-api-access-5pbjv") pod "47adddb7-6c5d-4b97-8502-2fe887d9b8dc" (UID: "47adddb7-6c5d-4b97-8502-2fe887d9b8dc"). InnerVolumeSpecName "kube-api-access-5pbjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.321971 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68cdb1a1-6b8f-410a-955a-aa1077491ae7-kube-api-access-f4vdh" (OuterVolumeSpecName: "kube-api-access-f4vdh") pod "68cdb1a1-6b8f-410a-955a-aa1077491ae7" (UID: "68cdb1a1-6b8f-410a-955a-aa1077491ae7"). InnerVolumeSpecName "kube-api-access-f4vdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.338525 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47adddb7-6c5d-4b97-8502-2fe887d9b8dc" (UID: "47adddb7-6c5d-4b97-8502-2fe887d9b8dc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.415447 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.415483 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4vdh\" (UniqueName: \"kubernetes.io/projected/68cdb1a1-6b8f-410a-955a-aa1077491ae7-kube-api-access-f4vdh\") on node \"crc\" DevicePath \"\"" Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.415495 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68cdb1a1-6b8f-410a-955a-aa1077491ae7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.415504 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pbjv\" (UniqueName: \"kubernetes.io/projected/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-kube-api-access-5pbjv\") on node \"crc\" DevicePath \"\"" Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.415511 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47adddb7-6c5d-4b97-8502-2fe887d9b8dc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.415519 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68cdb1a1-6b8f-410a-955a-aa1077491ae7-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.999483 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gk89z" event={"ID":"47adddb7-6c5d-4b97-8502-2fe887d9b8dc","Type":"ContainerDied","Data":"7460ffc59b19fee7fce89939349e2a99a4bd5ba746fc48836bb6fde0a1b07081"} Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.999778 4740 scope.go:117] "RemoveContainer" containerID="1d873e358bf056c3c71f696db16854df0fe6e58ac27921eefda8d2ff2d3c675e" Jan 05 14:46:44 crc kubenswrapper[4740]: I0105 14:46:44.999629 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gk89z" Jan 05 14:46:45 crc kubenswrapper[4740]: I0105 14:46:45.008512 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zsln" event={"ID":"68cdb1a1-6b8f-410a-955a-aa1077491ae7","Type":"ContainerDied","Data":"fba7f07329158a6e25d08da21eb5fb92ac4cfe62005ff9deae98958c10726151"} Jan 05 14:46:45 crc kubenswrapper[4740]: I0105 14:46:45.008621 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5zsln" Jan 05 14:46:45 crc kubenswrapper[4740]: I0105 14:46:45.032190 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gk89z"] Jan 05 14:46:45 crc kubenswrapper[4740]: I0105 14:46:45.034875 4740 scope.go:117] "RemoveContainer" containerID="793cc50c26e1809c9e0365b472f18a62b283b925b321add808b54597a9a16bda" Jan 05 14:46:45 crc kubenswrapper[4740]: I0105 14:46:45.053515 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gk89z"] Jan 05 14:46:45 crc kubenswrapper[4740]: I0105 14:46:45.065463 4740 scope.go:117] "RemoveContainer" containerID="699d26ba0613533b5165f4cc2e4a786d1182455eabc8a807b8c8de0e4a07a15a" Jan 05 14:46:45 crc kubenswrapper[4740]: I0105 14:46:45.065753 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5zsln"] Jan 05 14:46:45 crc kubenswrapper[4740]: I0105 14:46:45.084333 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5zsln"] Jan 05 14:46:45 crc kubenswrapper[4740]: I0105 14:46:45.123261 4740 scope.go:117] "RemoveContainer" containerID="778005c870171e4861881ee77cb9c8a6c66ecae130a74a8c31989f8ea4aa0261" Jan 05 14:46:45 crc kubenswrapper[4740]: I0105 14:46:45.175841 4740 scope.go:117] "RemoveContainer" containerID="52727c5539b679c68105c3c6f7c1710d018a82ee22fb3fcf24617e682c0e09ea" Jan 05 14:46:45 crc kubenswrapper[4740]: I0105 14:46:45.201693 4740 scope.go:117] "RemoveContainer" containerID="55395d17ce98197832b4910c9c23d0542e0da7a26c41a292c36bc67120c3cb05" Jan 05 14:46:46 crc kubenswrapper[4740]: I0105 14:46:46.990433 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47adddb7-6c5d-4b97-8502-2fe887d9b8dc" path="/var/lib/kubelet/pods/47adddb7-6c5d-4b97-8502-2fe887d9b8dc/volumes" Jan 05 14:46:46 crc kubenswrapper[4740]: I0105 14:46:46.992664 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68cdb1a1-6b8f-410a-955a-aa1077491ae7" path="/var/lib/kubelet/pods/68cdb1a1-6b8f-410a-955a-aa1077491ae7/volumes" Jan 05 14:49:01 crc kubenswrapper[4740]: I0105 14:49:01.916468 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:49:01 crc kubenswrapper[4740]: I0105 14:49:01.916940 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:49:31 crc kubenswrapper[4740]: I0105 14:49:31.916254 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:49:31 crc kubenswrapper[4740]: I0105 14:49:31.917102 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:50:01 crc kubenswrapper[4740]: I0105 14:50:01.916292 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:50:01 crc kubenswrapper[4740]: I0105 14:50:01.916839 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:50:01 crc kubenswrapper[4740]: I0105 14:50:01.916875 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 14:50:01 crc kubenswrapper[4740]: I0105 14:50:01.917739 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f437b95577e0870ce99465071c031d6e21d5a246939c91d667331ca2a4ea584c"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 14:50:01 crc kubenswrapper[4740]: I0105 14:50:01.917798 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://f437b95577e0870ce99465071c031d6e21d5a246939c91d667331ca2a4ea584c" gracePeriod=600 Jan 05 14:50:02 crc kubenswrapper[4740]: I0105 14:50:02.709853 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="f437b95577e0870ce99465071c031d6e21d5a246939c91d667331ca2a4ea584c" exitCode=0 Jan 05 14:50:02 crc kubenswrapper[4740]: I0105 14:50:02.709968 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"f437b95577e0870ce99465071c031d6e21d5a246939c91d667331ca2a4ea584c"} Jan 05 14:50:02 crc kubenswrapper[4740]: I0105 14:50:02.710457 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa"} Jan 05 14:50:02 crc kubenswrapper[4740]: I0105 14:50:02.710483 4740 scope.go:117] "RemoveContainer" containerID="b1f4347da865fdc9285bd321a91fd8f79042db1ec69152d1099eca503f589246" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.679400 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5krzj"] Jan 05 14:51:44 crc kubenswrapper[4740]: E0105 14:51:44.681656 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68cdb1a1-6b8f-410a-955a-aa1077491ae7" containerName="extract-utilities" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.681702 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="68cdb1a1-6b8f-410a-955a-aa1077491ae7" 
containerName="extract-utilities" Jan 05 14:51:44 crc kubenswrapper[4740]: E0105 14:51:44.681791 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68cdb1a1-6b8f-410a-955a-aa1077491ae7" containerName="registry-server" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.681812 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="68cdb1a1-6b8f-410a-955a-aa1077491ae7" containerName="registry-server" Jan 05 14:51:44 crc kubenswrapper[4740]: E0105 14:51:44.681860 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47adddb7-6c5d-4b97-8502-2fe887d9b8dc" containerName="extract-utilities" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.681878 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="47adddb7-6c5d-4b97-8502-2fe887d9b8dc" containerName="extract-utilities" Jan 05 14:51:44 crc kubenswrapper[4740]: E0105 14:51:44.681909 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47adddb7-6c5d-4b97-8502-2fe887d9b8dc" containerName="registry-server" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.681925 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="47adddb7-6c5d-4b97-8502-2fe887d9b8dc" containerName="registry-server" Jan 05 14:51:44 crc kubenswrapper[4740]: E0105 14:51:44.681951 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47adddb7-6c5d-4b97-8502-2fe887d9b8dc" containerName="extract-content" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.681966 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="47adddb7-6c5d-4b97-8502-2fe887d9b8dc" containerName="extract-content" Jan 05 14:51:44 crc kubenswrapper[4740]: E0105 14:51:44.682161 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68cdb1a1-6b8f-410a-955a-aa1077491ae7" containerName="extract-content" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.682179 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="68cdb1a1-6b8f-410a-955a-aa1077491ae7" containerName="extract-content" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.682747 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="68cdb1a1-6b8f-410a-955a-aa1077491ae7" containerName="registry-server" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.682813 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="47adddb7-6c5d-4b97-8502-2fe887d9b8dc" containerName="registry-server" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.689387 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5krzj" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.699427 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5krzj"] Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.754118 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g59fb\" (UniqueName: \"kubernetes.io/projected/0e0fa729-14fe-44de-845d-b39202c84f81-kube-api-access-g59fb\") pod \"redhat-operators-5krzj\" (UID: \"0e0fa729-14fe-44de-845d-b39202c84f81\") " pod="openshift-marketplace/redhat-operators-5krzj" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.754173 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e0fa729-14fe-44de-845d-b39202c84f81-utilities\") pod \"redhat-operators-5krzj\" (UID: \"0e0fa729-14fe-44de-845d-b39202c84f81\") " pod="openshift-marketplace/redhat-operators-5krzj" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.754268 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e0fa729-14fe-44de-845d-b39202c84f81-catalog-content\") pod \"redhat-operators-5krzj\" (UID: \"0e0fa729-14fe-44de-845d-b39202c84f81\") " pod="openshift-marketplace/redhat-operators-5krzj" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.856810 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e0fa729-14fe-44de-845d-b39202c84f81-catalog-content\") pod \"redhat-operators-5krzj\" (UID: \"0e0fa729-14fe-44de-845d-b39202c84f81\") " pod="openshift-marketplace/redhat-operators-5krzj" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.857119 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g59fb\" (UniqueName: \"kubernetes.io/projected/0e0fa729-14fe-44de-845d-b39202c84f81-kube-api-access-g59fb\") pod \"redhat-operators-5krzj\" (UID: \"0e0fa729-14fe-44de-845d-b39202c84f81\") " pod="openshift-marketplace/redhat-operators-5krzj" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.857167 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e0fa729-14fe-44de-845d-b39202c84f81-utilities\") pod \"redhat-operators-5krzj\" (UID: \"0e0fa729-14fe-44de-845d-b39202c84f81\") " pod="openshift-marketplace/redhat-operators-5krzj" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.857436 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e0fa729-14fe-44de-845d-b39202c84f81-catalog-content\") pod \"redhat-operators-5krzj\" (UID: \"0e0fa729-14fe-44de-845d-b39202c84f81\") " pod="openshift-marketplace/redhat-operators-5krzj" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.857700 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e0fa729-14fe-44de-845d-b39202c84f81-utilities\") pod \"redhat-operators-5krzj\" (UID: \"0e0fa729-14fe-44de-845d-b39202c84f81\") " pod="openshift-marketplace/redhat-operators-5krzj" Jan 05 14:51:44 crc kubenswrapper[4740]: I0105 14:51:44.881607 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-g59fb\" (UniqueName: \"kubernetes.io/projected/0e0fa729-14fe-44de-845d-b39202c84f81-kube-api-access-g59fb\") pod \"redhat-operators-5krzj\" (UID: \"0e0fa729-14fe-44de-845d-b39202c84f81\") " pod="openshift-marketplace/redhat-operators-5krzj" Jan 05 14:51:45 crc kubenswrapper[4740]: I0105 14:51:45.022508 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5krzj" Jan 05 14:51:45 crc kubenswrapper[4740]: I0105 14:51:45.552531 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5krzj"] Jan 05 14:51:46 crc kubenswrapper[4740]: I0105 14:51:46.043203 4740 generic.go:334] "Generic (PLEG): container finished" podID="0e0fa729-14fe-44de-845d-b39202c84f81" containerID="15f33dff0dbf690ec37ba47228dd8a0e0a2f4de4ab4a05e20f1db197c1476692" exitCode=0 Jan 05 14:51:46 crc kubenswrapper[4740]: I0105 14:51:46.043309 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5krzj" event={"ID":"0e0fa729-14fe-44de-845d-b39202c84f81","Type":"ContainerDied","Data":"15f33dff0dbf690ec37ba47228dd8a0e0a2f4de4ab4a05e20f1db197c1476692"} Jan 05 14:51:46 crc kubenswrapper[4740]: I0105 14:51:46.043510 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5krzj" event={"ID":"0e0fa729-14fe-44de-845d-b39202c84f81","Type":"ContainerStarted","Data":"272c4c6cc1532ece127dfd2e57ae0806bc747c5e27425bb47bd798babfb167cf"} Jan 05 14:51:46 crc kubenswrapper[4740]: I0105 14:51:46.045083 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 05 14:51:48 crc kubenswrapper[4740]: I0105 14:51:48.070434 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5krzj" event={"ID":"0e0fa729-14fe-44de-845d-b39202c84f81","Type":"ContainerStarted","Data":"1d3431a8b75ed93a31a38bd7eb3b0d11dcfc573455ec166182ef882c769bfd3c"} Jan 05 14:51:52 crc kubenswrapper[4740]: I0105 14:51:52.122621 4740 generic.go:334] "Generic (PLEG): container finished" podID="0e0fa729-14fe-44de-845d-b39202c84f81" containerID="1d3431a8b75ed93a31a38bd7eb3b0d11dcfc573455ec166182ef882c769bfd3c" exitCode=0 Jan 05 14:51:52 crc kubenswrapper[4740]: I0105 14:51:52.122687 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5krzj" event={"ID":"0e0fa729-14fe-44de-845d-b39202c84f81","Type":"ContainerDied","Data":"1d3431a8b75ed93a31a38bd7eb3b0d11dcfc573455ec166182ef882c769bfd3c"} Jan 05 14:51:53 crc kubenswrapper[4740]: I0105 14:51:53.143449 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5krzj" event={"ID":"0e0fa729-14fe-44de-845d-b39202c84f81","Type":"ContainerStarted","Data":"ea42a8f7988ca77ea12c4b65255f4f8d0a9f07132695fc0133af7dc0a4b53e20"} Jan 05 14:51:53 crc kubenswrapper[4740]: I0105 14:51:53.174832 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5krzj" podStartSLOduration=2.445254572 podStartE2EDuration="9.174807467s" podCreationTimestamp="2026-01-05 14:51:44 +0000 UTC" firstStartedPulling="2026-01-05 14:51:46.044841708 +0000 UTC m=+3755.351750287" lastFinishedPulling="2026-01-05 14:51:52.774394603 +0000 UTC m=+3762.081303182" observedRunningTime="2026-01-05 14:51:53.167374127 +0000 UTC m=+3762.474282736" watchObservedRunningTime="2026-01-05 14:51:53.174807467 +0000 UTC m=+3762.481716056" Jan 05 14:51:55 crc 
kubenswrapper[4740]: I0105 14:51:55.023791 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5krzj" Jan 05 14:51:55 crc kubenswrapper[4740]: I0105 14:51:55.024114 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5krzj" Jan 05 14:51:56 crc kubenswrapper[4740]: I0105 14:51:56.080566 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5krzj" podUID="0e0fa729-14fe-44de-845d-b39202c84f81" containerName="registry-server" probeResult="failure" output=< Jan 05 14:51:56 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 14:51:56 crc kubenswrapper[4740]: > Jan 05 14:52:06 crc kubenswrapper[4740]: I0105 14:52:06.070294 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5krzj" podUID="0e0fa729-14fe-44de-845d-b39202c84f81" containerName="registry-server" probeResult="failure" output=< Jan 05 14:52:06 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 14:52:06 crc kubenswrapper[4740]: > Jan 05 14:52:15 crc kubenswrapper[4740]: I0105 14:52:15.071539 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5krzj" Jan 05 14:52:15 crc kubenswrapper[4740]: I0105 14:52:15.138206 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5krzj" Jan 05 14:52:15 crc kubenswrapper[4740]: I0105 14:52:15.873578 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5krzj"] Jan 05 14:52:16 crc kubenswrapper[4740]: I0105 14:52:16.424541 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5krzj" podUID="0e0fa729-14fe-44de-845d-b39202c84f81" containerName="registry-server" containerID="cri-o://ea42a8f7988ca77ea12c4b65255f4f8d0a9f07132695fc0133af7dc0a4b53e20" gracePeriod=2 Jan 05 14:52:17 crc kubenswrapper[4740]: I0105 14:52:17.436327 4740 generic.go:334] "Generic (PLEG): container finished" podID="0e0fa729-14fe-44de-845d-b39202c84f81" containerID="ea42a8f7988ca77ea12c4b65255f4f8d0a9f07132695fc0133af7dc0a4b53e20" exitCode=0 Jan 05 14:52:17 crc kubenswrapper[4740]: I0105 14:52:17.436406 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5krzj" event={"ID":"0e0fa729-14fe-44de-845d-b39202c84f81","Type":"ContainerDied","Data":"ea42a8f7988ca77ea12c4b65255f4f8d0a9f07132695fc0133af7dc0a4b53e20"} Jan 05 14:52:17 crc kubenswrapper[4740]: I0105 14:52:17.436748 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5krzj" event={"ID":"0e0fa729-14fe-44de-845d-b39202c84f81","Type":"ContainerDied","Data":"272c4c6cc1532ece127dfd2e57ae0806bc747c5e27425bb47bd798babfb167cf"} Jan 05 14:52:17 crc kubenswrapper[4740]: I0105 14:52:17.436764 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="272c4c6cc1532ece127dfd2e57ae0806bc747c5e27425bb47bd798babfb167cf" Jan 05 14:52:17 crc kubenswrapper[4740]: I0105 14:52:17.533836 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5krzj" Jan 05 14:52:17 crc kubenswrapper[4740]: I0105 14:52:17.578779 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e0fa729-14fe-44de-845d-b39202c84f81-utilities\") pod \"0e0fa729-14fe-44de-845d-b39202c84f81\" (UID: \"0e0fa729-14fe-44de-845d-b39202c84f81\") " Jan 05 14:52:17 crc kubenswrapper[4740]: I0105 14:52:17.578868 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g59fb\" (UniqueName: \"kubernetes.io/projected/0e0fa729-14fe-44de-845d-b39202c84f81-kube-api-access-g59fb\") pod \"0e0fa729-14fe-44de-845d-b39202c84f81\" (UID: \"0e0fa729-14fe-44de-845d-b39202c84f81\") " Jan 05 14:52:17 crc kubenswrapper[4740]: I0105 14:52:17.578941 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e0fa729-14fe-44de-845d-b39202c84f81-catalog-content\") pod \"0e0fa729-14fe-44de-845d-b39202c84f81\" (UID: \"0e0fa729-14fe-44de-845d-b39202c84f81\") " Jan 05 14:52:17 crc kubenswrapper[4740]: I0105 14:52:17.579808 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e0fa729-14fe-44de-845d-b39202c84f81-utilities" (OuterVolumeSpecName: "utilities") pod "0e0fa729-14fe-44de-845d-b39202c84f81" (UID: "0e0fa729-14fe-44de-845d-b39202c84f81"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:52:17 crc kubenswrapper[4740]: I0105 14:52:17.590272 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e0fa729-14fe-44de-845d-b39202c84f81-kube-api-access-g59fb" (OuterVolumeSpecName: "kube-api-access-g59fb") pod "0e0fa729-14fe-44de-845d-b39202c84f81" (UID: "0e0fa729-14fe-44de-845d-b39202c84f81"). InnerVolumeSpecName "kube-api-access-g59fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:52:17 crc kubenswrapper[4740]: I0105 14:52:17.684777 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e0fa729-14fe-44de-845d-b39202c84f81-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:52:17 crc kubenswrapper[4740]: I0105 14:52:17.684816 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g59fb\" (UniqueName: \"kubernetes.io/projected/0e0fa729-14fe-44de-845d-b39202c84f81-kube-api-access-g59fb\") on node \"crc\" DevicePath \"\"" Jan 05 14:52:17 crc kubenswrapper[4740]: I0105 14:52:17.730380 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e0fa729-14fe-44de-845d-b39202c84f81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e0fa729-14fe-44de-845d-b39202c84f81" (UID: "0e0fa729-14fe-44de-845d-b39202c84f81"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:52:17 crc kubenswrapper[4740]: I0105 14:52:17.786913 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e0fa729-14fe-44de-845d-b39202c84f81-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:52:18 crc kubenswrapper[4740]: I0105 14:52:18.452198 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5krzj" Jan 05 14:52:18 crc kubenswrapper[4740]: I0105 14:52:18.517222 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5krzj"] Jan 05 14:52:18 crc kubenswrapper[4740]: I0105 14:52:18.530543 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5krzj"] Jan 05 14:52:18 crc kubenswrapper[4740]: I0105 14:52:18.983086 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e0fa729-14fe-44de-845d-b39202c84f81" path="/var/lib/kubelet/pods/0e0fa729-14fe-44de-845d-b39202c84f81/volumes" Jan 05 14:52:31 crc kubenswrapper[4740]: I0105 14:52:31.916691 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:52:31 crc kubenswrapper[4740]: I0105 14:52:31.917409 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:53:01 crc kubenswrapper[4740]: I0105 14:53:01.916030 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:53:01 crc kubenswrapper[4740]: I0105 14:53:01.916640 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:53:12 crc kubenswrapper[4740]: I0105 14:53:12.145665 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7vtj7"] Jan 05 14:53:12 crc kubenswrapper[4740]: E0105 14:53:12.148686 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0fa729-14fe-44de-845d-b39202c84f81" containerName="extract-content" Jan 05 14:53:12 crc kubenswrapper[4740]: I0105 14:53:12.148779 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0fa729-14fe-44de-845d-b39202c84f81" containerName="extract-content" Jan 05 14:53:12 crc kubenswrapper[4740]: E0105 14:53:12.148877 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0fa729-14fe-44de-845d-b39202c84f81" containerName="extract-utilities" Jan 05 14:53:12 crc kubenswrapper[4740]: I0105 14:53:12.150757 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0fa729-14fe-44de-845d-b39202c84f81" containerName="extract-utilities" Jan 05 14:53:12 crc kubenswrapper[4740]: E0105 14:53:12.150854 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0fa729-14fe-44de-845d-b39202c84f81" containerName="registry-server" Jan 05 14:53:12 crc kubenswrapper[4740]: I0105 14:53:12.150919 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0fa729-14fe-44de-845d-b39202c84f81" containerName="registry-server" Jan 05 14:53:12 crc 
kubenswrapper[4740]: I0105 14:53:12.151262 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e0fa729-14fe-44de-845d-b39202c84f81" containerName="registry-server" Jan 05 14:53:12 crc kubenswrapper[4740]: I0105 14:53:12.153642 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7vtj7" Jan 05 14:53:12 crc kubenswrapper[4740]: I0105 14:53:12.158834 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7vtj7"] Jan 05 14:53:12 crc kubenswrapper[4740]: I0105 14:53:12.298683 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr46w\" (UniqueName: \"kubernetes.io/projected/d15be24c-53ee-4bc0-b134-13533129f6be-kube-api-access-wr46w\") pod \"certified-operators-7vtj7\" (UID: \"d15be24c-53ee-4bc0-b134-13533129f6be\") " pod="openshift-marketplace/certified-operators-7vtj7" Jan 05 14:53:12 crc kubenswrapper[4740]: I0105 14:53:12.298939 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d15be24c-53ee-4bc0-b134-13533129f6be-utilities\") pod \"certified-operators-7vtj7\" (UID: \"d15be24c-53ee-4bc0-b134-13533129f6be\") " pod="openshift-marketplace/certified-operators-7vtj7" Jan 05 14:53:12 crc kubenswrapper[4740]: I0105 14:53:12.299114 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d15be24c-53ee-4bc0-b134-13533129f6be-catalog-content\") pod \"certified-operators-7vtj7\" (UID: \"d15be24c-53ee-4bc0-b134-13533129f6be\") " pod="openshift-marketplace/certified-operators-7vtj7" Jan 05 14:53:12 crc kubenswrapper[4740]: I0105 14:53:12.400887 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d15be24c-53ee-4bc0-b134-13533129f6be-catalog-content\") pod \"certified-operators-7vtj7\" (UID: \"d15be24c-53ee-4bc0-b134-13533129f6be\") " pod="openshift-marketplace/certified-operators-7vtj7" Jan 05 14:53:12 crc kubenswrapper[4740]: I0105 14:53:12.401021 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr46w\" (UniqueName: \"kubernetes.io/projected/d15be24c-53ee-4bc0-b134-13533129f6be-kube-api-access-wr46w\") pod \"certified-operators-7vtj7\" (UID: \"d15be24c-53ee-4bc0-b134-13533129f6be\") " pod="openshift-marketplace/certified-operators-7vtj7" Jan 05 14:53:12 crc kubenswrapper[4740]: I0105 14:53:12.401135 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d15be24c-53ee-4bc0-b134-13533129f6be-utilities\") pod \"certified-operators-7vtj7\" (UID: \"d15be24c-53ee-4bc0-b134-13533129f6be\") " pod="openshift-marketplace/certified-operators-7vtj7" Jan 05 14:53:12 crc kubenswrapper[4740]: I0105 14:53:12.401498 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d15be24c-53ee-4bc0-b134-13533129f6be-catalog-content\") pod \"certified-operators-7vtj7\" (UID: \"d15be24c-53ee-4bc0-b134-13533129f6be\") " pod="openshift-marketplace/certified-operators-7vtj7" Jan 05 14:53:12 crc kubenswrapper[4740]: I0105 14:53:12.401517 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d15be24c-53ee-4bc0-b134-13533129f6be-utilities\") pod \"certified-operators-7vtj7\" (UID: \"d15be24c-53ee-4bc0-b134-13533129f6be\") " pod="openshift-marketplace/certified-operators-7vtj7" Jan 05 14:53:12 crc kubenswrapper[4740]: I0105 14:53:12.430177 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr46w\" (UniqueName: \"kubernetes.io/projected/d15be24c-53ee-4bc0-b134-13533129f6be-kube-api-access-wr46w\") pod \"certified-operators-7vtj7\" (UID: \"d15be24c-53ee-4bc0-b134-13533129f6be\") " pod="openshift-marketplace/certified-operators-7vtj7" Jan 05 14:53:12 crc kubenswrapper[4740]: I0105 14:53:12.489615 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7vtj7" Jan 05 14:53:13 crc kubenswrapper[4740]: I0105 14:53:13.100645 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7vtj7"] Jan 05 14:53:13 crc kubenswrapper[4740]: I0105 14:53:13.170698 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vtj7" event={"ID":"d15be24c-53ee-4bc0-b134-13533129f6be","Type":"ContainerStarted","Data":"6daecd52d8ef9ce8847876c214a154c03dfee0e1083a08e6f8189d3ae900f919"} Jan 05 14:53:14 crc kubenswrapper[4740]: I0105 14:53:14.183277 4740 generic.go:334] "Generic (PLEG): container finished" podID="d15be24c-53ee-4bc0-b134-13533129f6be" containerID="3a61824c20c03b2aabaf9cbfd37b6b7ba7808707223b00c52f37d02b095a4728" exitCode=0 Jan 05 14:53:14 crc kubenswrapper[4740]: I0105 14:53:14.183345 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vtj7" event={"ID":"d15be24c-53ee-4bc0-b134-13533129f6be","Type":"ContainerDied","Data":"3a61824c20c03b2aabaf9cbfd37b6b7ba7808707223b00c52f37d02b095a4728"} Jan 05 14:53:15 crc kubenswrapper[4740]: I0105 14:53:15.196892 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vtj7" event={"ID":"d15be24c-53ee-4bc0-b134-13533129f6be","Type":"ContainerStarted","Data":"29659aa83220ceb87229fa14615153ab56d7ca55af9abaa884a7d9e01cafd8ff"} Jan 05 14:53:17 crc kubenswrapper[4740]: I0105 14:53:17.229600 4740 generic.go:334] "Generic (PLEG): container finished" podID="d15be24c-53ee-4bc0-b134-13533129f6be" containerID="29659aa83220ceb87229fa14615153ab56d7ca55af9abaa884a7d9e01cafd8ff" exitCode=0 Jan 05 14:53:17 crc kubenswrapper[4740]: I0105 14:53:17.229736 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vtj7" event={"ID":"d15be24c-53ee-4bc0-b134-13533129f6be","Type":"ContainerDied","Data":"29659aa83220ceb87229fa14615153ab56d7ca55af9abaa884a7d9e01cafd8ff"} Jan 05 14:53:18 crc kubenswrapper[4740]: I0105 14:53:18.243147 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vtj7" event={"ID":"d15be24c-53ee-4bc0-b134-13533129f6be","Type":"ContainerStarted","Data":"5701827072b42ba2997c5cba0b1f9d9056a36af6d0a6ffb71709e7134784d1fc"} Jan 05 14:53:18 crc kubenswrapper[4740]: I0105 14:53:18.282356 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7vtj7" podStartSLOduration=2.669085933 podStartE2EDuration="6.282336816s" podCreationTimestamp="2026-01-05 14:53:12 +0000 UTC" firstStartedPulling="2026-01-05 14:53:14.185512842 +0000 UTC m=+3843.492421471" lastFinishedPulling="2026-01-05 14:53:17.798763765 +0000 
UTC m=+3847.105672354" observedRunningTime="2026-01-05 14:53:18.270128667 +0000 UTC m=+3847.577037256" watchObservedRunningTime="2026-01-05 14:53:18.282336816 +0000 UTC m=+3847.589245395" Jan 05 14:53:22 crc kubenswrapper[4740]: I0105 14:53:22.490155 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7vtj7" Jan 05 14:53:22 crc kubenswrapper[4740]: I0105 14:53:22.491776 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7vtj7" Jan 05 14:53:22 crc kubenswrapper[4740]: I0105 14:53:22.572406 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7vtj7" Jan 05 14:53:23 crc kubenswrapper[4740]: I0105 14:53:23.383904 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7vtj7" Jan 05 14:53:23 crc kubenswrapper[4740]: I0105 14:53:23.444006 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7vtj7"] Jan 05 14:53:25 crc kubenswrapper[4740]: I0105 14:53:25.334928 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7vtj7" podUID="d15be24c-53ee-4bc0-b134-13533129f6be" containerName="registry-server" containerID="cri-o://5701827072b42ba2997c5cba0b1f9d9056a36af6d0a6ffb71709e7134784d1fc" gracePeriod=2 Jan 05 14:53:26 crc kubenswrapper[4740]: I0105 14:53:26.367398 4740 generic.go:334] "Generic (PLEG): container finished" podID="d15be24c-53ee-4bc0-b134-13533129f6be" containerID="5701827072b42ba2997c5cba0b1f9d9056a36af6d0a6ffb71709e7134784d1fc" exitCode=0 Jan 05 14:53:26 crc kubenswrapper[4740]: I0105 14:53:26.367783 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vtj7" event={"ID":"d15be24c-53ee-4bc0-b134-13533129f6be","Type":"ContainerDied","Data":"5701827072b42ba2997c5cba0b1f9d9056a36af6d0a6ffb71709e7134784d1fc"} Jan 05 14:53:26 crc kubenswrapper[4740]: I0105 14:53:26.477702 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7vtj7" Jan 05 14:53:26 crc kubenswrapper[4740]: I0105 14:53:26.621618 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d15be24c-53ee-4bc0-b134-13533129f6be-utilities\") pod \"d15be24c-53ee-4bc0-b134-13533129f6be\" (UID: \"d15be24c-53ee-4bc0-b134-13533129f6be\") " Jan 05 14:53:26 crc kubenswrapper[4740]: I0105 14:53:26.622150 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d15be24c-53ee-4bc0-b134-13533129f6be-catalog-content\") pod \"d15be24c-53ee-4bc0-b134-13533129f6be\" (UID: \"d15be24c-53ee-4bc0-b134-13533129f6be\") " Jan 05 14:53:26 crc kubenswrapper[4740]: I0105 14:53:26.622490 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr46w\" (UniqueName: \"kubernetes.io/projected/d15be24c-53ee-4bc0-b134-13533129f6be-kube-api-access-wr46w\") pod \"d15be24c-53ee-4bc0-b134-13533129f6be\" (UID: \"d15be24c-53ee-4bc0-b134-13533129f6be\") " Jan 05 14:53:26 crc kubenswrapper[4740]: I0105 14:53:26.628240 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d15be24c-53ee-4bc0-b134-13533129f6be-utilities" (OuterVolumeSpecName: "utilities") pod "d15be24c-53ee-4bc0-b134-13533129f6be" (UID: "d15be24c-53ee-4bc0-b134-13533129f6be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:53:26 crc kubenswrapper[4740]: I0105 14:53:26.631142 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d15be24c-53ee-4bc0-b134-13533129f6be-kube-api-access-wr46w" (OuterVolumeSpecName: "kube-api-access-wr46w") pod "d15be24c-53ee-4bc0-b134-13533129f6be" (UID: "d15be24c-53ee-4bc0-b134-13533129f6be"). InnerVolumeSpecName "kube-api-access-wr46w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:53:26 crc kubenswrapper[4740]: I0105 14:53:26.695796 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d15be24c-53ee-4bc0-b134-13533129f6be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d15be24c-53ee-4bc0-b134-13533129f6be" (UID: "d15be24c-53ee-4bc0-b134-13533129f6be"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:53:26 crc kubenswrapper[4740]: I0105 14:53:26.726192 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d15be24c-53ee-4bc0-b134-13533129f6be-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:53:26 crc kubenswrapper[4740]: I0105 14:53:26.726247 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d15be24c-53ee-4bc0-b134-13533129f6be-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:53:26 crc kubenswrapper[4740]: I0105 14:53:26.726262 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr46w\" (UniqueName: \"kubernetes.io/projected/d15be24c-53ee-4bc0-b134-13533129f6be-kube-api-access-wr46w\") on node \"crc\" DevicePath \"\"" Jan 05 14:53:27 crc kubenswrapper[4740]: I0105 14:53:27.381546 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vtj7" event={"ID":"d15be24c-53ee-4bc0-b134-13533129f6be","Type":"ContainerDied","Data":"6daecd52d8ef9ce8847876c214a154c03dfee0e1083a08e6f8189d3ae900f919"} Jan 05 14:53:27 crc kubenswrapper[4740]: I0105 14:53:27.381973 4740 scope.go:117] "RemoveContainer" containerID="5701827072b42ba2997c5cba0b1f9d9056a36af6d0a6ffb71709e7134784d1fc" Jan 05 14:53:27 crc kubenswrapper[4740]: I0105 14:53:27.381857 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7vtj7" Jan 05 14:53:27 crc kubenswrapper[4740]: I0105 14:53:27.415165 4740 scope.go:117] "RemoveContainer" containerID="29659aa83220ceb87229fa14615153ab56d7ca55af9abaa884a7d9e01cafd8ff" Jan 05 14:53:27 crc kubenswrapper[4740]: I0105 14:53:27.418407 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7vtj7"] Jan 05 14:53:27 crc kubenswrapper[4740]: I0105 14:53:27.432094 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7vtj7"] Jan 05 14:53:27 crc kubenswrapper[4740]: I0105 14:53:27.476422 4740 scope.go:117] "RemoveContainer" containerID="3a61824c20c03b2aabaf9cbfd37b6b7ba7808707223b00c52f37d02b095a4728" Jan 05 14:53:28 crc kubenswrapper[4740]: I0105 14:53:28.985374 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d15be24c-53ee-4bc0-b134-13533129f6be" path="/var/lib/kubelet/pods/d15be24c-53ee-4bc0-b134-13533129f6be/volumes" Jan 05 14:53:31 crc kubenswrapper[4740]: I0105 14:53:31.915715 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 14:53:31 crc kubenswrapper[4740]: I0105 14:53:31.916360 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 14:53:31 crc kubenswrapper[4740]: I0105 14:53:31.916409 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 14:53:31 crc kubenswrapper[4740]: I0105 14:53:31.917685 4740 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 14:53:31 crc kubenswrapper[4740]: I0105 14:53:31.917784 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" gracePeriod=600 Jan 05 14:53:32 crc kubenswrapper[4740]: E0105 14:53:32.038562 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:53:32 crc kubenswrapper[4740]: I0105 14:53:32.445967 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" exitCode=0 Jan 05 14:53:32 crc kubenswrapper[4740]: I0105 14:53:32.446048 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa"} Jan 05 14:53:32 crc kubenswrapper[4740]: I0105 14:53:32.446124 4740 scope.go:117] "RemoveContainer" containerID="f437b95577e0870ce99465071c031d6e21d5a246939c91d667331ca2a4ea584c" Jan 05 14:53:32 crc kubenswrapper[4740]: I0105 14:53:32.446953 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:53:32 crc kubenswrapper[4740]: E0105 14:53:32.447387 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:53:42 crc kubenswrapper[4740]: I0105 14:53:42.969053 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:53:42 crc kubenswrapper[4740]: E0105 14:53:42.970487 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:53:57 crc kubenswrapper[4740]: I0105 14:53:57.969805 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 
14:53:57 crc kubenswrapper[4740]: E0105 14:53:57.970655 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:54:12 crc kubenswrapper[4740]: I0105 14:54:12.969335 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:54:12 crc kubenswrapper[4740]: E0105 14:54:12.971107 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:54:26 crc kubenswrapper[4740]: I0105 14:54:26.968706 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:54:26 crc kubenswrapper[4740]: E0105 14:54:26.969792 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:54:38 crc kubenswrapper[4740]: I0105 14:54:38.968724 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:54:38 crc kubenswrapper[4740]: E0105 14:54:38.969759 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:54:52 crc kubenswrapper[4740]: I0105 14:54:52.969239 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:54:52 crc kubenswrapper[4740]: E0105 14:54:52.969999 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:55:06 crc kubenswrapper[4740]: I0105 14:55:06.969343 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:55:06 crc kubenswrapper[4740]: E0105 14:55:06.970755 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:55:21 crc kubenswrapper[4740]: I0105 14:55:21.970629 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:55:21 crc kubenswrapper[4740]: E0105 14:55:21.972674 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:55:36 crc kubenswrapper[4740]: I0105 14:55:36.968506 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:55:36 crc kubenswrapper[4740]: E0105 14:55:36.969816 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:55:48 crc kubenswrapper[4740]: I0105 14:55:48.968609 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:55:48 crc kubenswrapper[4740]: E0105 14:55:48.969445 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:55:59 crc kubenswrapper[4740]: I0105 14:55:59.969713 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:55:59 crc kubenswrapper[4740]: E0105 14:55:59.970539 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:56:14 crc kubenswrapper[4740]: I0105 14:56:14.969527 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:56:14 crc kubenswrapper[4740]: E0105 14:56:14.970899 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:56:28 crc kubenswrapper[4740]: I0105 14:56:28.968358 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:56:28 crc kubenswrapper[4740]: E0105 14:56:28.969203 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:56:42 crc kubenswrapper[4740]: I0105 14:56:42.968571 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:56:42 crc kubenswrapper[4740]: E0105 14:56:42.969557 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:56:54 crc kubenswrapper[4740]: I0105 14:56:54.969537 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:56:54 crc kubenswrapper[4740]: E0105 14:56:54.970968 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:57:05 crc kubenswrapper[4740]: I0105 14:57:05.968431 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:57:05 crc kubenswrapper[4740]: E0105 14:57:05.969294 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:57:20 crc kubenswrapper[4740]: I0105 14:57:20.981739 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:57:20 crc kubenswrapper[4740]: E0105 14:57:20.982749 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:57:26 crc kubenswrapper[4740]: I0105 14:57:26.459459 4740 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rnldr"] Jan 05 14:57:26 crc kubenswrapper[4740]: E0105 14:57:26.462757 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15be24c-53ee-4bc0-b134-13533129f6be" containerName="extract-content" Jan 05 14:57:26 crc kubenswrapper[4740]: I0105 14:57:26.462794 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15be24c-53ee-4bc0-b134-13533129f6be" containerName="extract-content" Jan 05 14:57:26 crc kubenswrapper[4740]: E0105 14:57:26.462814 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15be24c-53ee-4bc0-b134-13533129f6be" containerName="extract-utilities" Jan 05 14:57:26 crc kubenswrapper[4740]: I0105 14:57:26.462823 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15be24c-53ee-4bc0-b134-13533129f6be" containerName="extract-utilities" Jan 05 14:57:26 crc kubenswrapper[4740]: E0105 14:57:26.462833 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15be24c-53ee-4bc0-b134-13533129f6be" containerName="registry-server" Jan 05 14:57:26 crc kubenswrapper[4740]: I0105 14:57:26.462840 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15be24c-53ee-4bc0-b134-13533129f6be" containerName="registry-server" Jan 05 14:57:26 crc kubenswrapper[4740]: I0105 14:57:26.463129 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15be24c-53ee-4bc0-b134-13533129f6be" containerName="registry-server" Jan 05 14:57:26 crc kubenswrapper[4740]: I0105 14:57:26.465189 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rnldr" Jan 05 14:57:26 crc kubenswrapper[4740]: I0105 14:57:26.481440 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rnldr"] Jan 05 14:57:26 crc kubenswrapper[4740]: I0105 14:57:26.549669 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4544660-fb90-4ea0-b0a5-b95611c2ff38-catalog-content\") pod \"community-operators-rnldr\" (UID: \"b4544660-fb90-4ea0-b0a5-b95611c2ff38\") " pod="openshift-marketplace/community-operators-rnldr" Jan 05 14:57:26 crc kubenswrapper[4740]: I0105 14:57:26.549724 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmdfw\" (UniqueName: \"kubernetes.io/projected/b4544660-fb90-4ea0-b0a5-b95611c2ff38-kube-api-access-mmdfw\") pod \"community-operators-rnldr\" (UID: \"b4544660-fb90-4ea0-b0a5-b95611c2ff38\") " pod="openshift-marketplace/community-operators-rnldr" Jan 05 14:57:26 crc kubenswrapper[4740]: I0105 14:57:26.550033 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4544660-fb90-4ea0-b0a5-b95611c2ff38-utilities\") pod \"community-operators-rnldr\" (UID: \"b4544660-fb90-4ea0-b0a5-b95611c2ff38\") " pod="openshift-marketplace/community-operators-rnldr" Jan 05 14:57:26 crc kubenswrapper[4740]: I0105 14:57:26.652544 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4544660-fb90-4ea0-b0a5-b95611c2ff38-catalog-content\") pod \"community-operators-rnldr\" (UID: \"b4544660-fb90-4ea0-b0a5-b95611c2ff38\") " pod="openshift-marketplace/community-operators-rnldr" Jan 05 14:57:26 crc kubenswrapper[4740]: 
I0105 14:57:26.652607 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmdfw\" (UniqueName: \"kubernetes.io/projected/b4544660-fb90-4ea0-b0a5-b95611c2ff38-kube-api-access-mmdfw\") pod \"community-operators-rnldr\" (UID: \"b4544660-fb90-4ea0-b0a5-b95611c2ff38\") " pod="openshift-marketplace/community-operators-rnldr" Jan 05 14:57:26 crc kubenswrapper[4740]: I0105 14:57:26.652743 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4544660-fb90-4ea0-b0a5-b95611c2ff38-utilities\") pod \"community-operators-rnldr\" (UID: \"b4544660-fb90-4ea0-b0a5-b95611c2ff38\") " pod="openshift-marketplace/community-operators-rnldr" Jan 05 14:57:26 crc kubenswrapper[4740]: I0105 14:57:26.653369 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4544660-fb90-4ea0-b0a5-b95611c2ff38-utilities\") pod \"community-operators-rnldr\" (UID: \"b4544660-fb90-4ea0-b0a5-b95611c2ff38\") " pod="openshift-marketplace/community-operators-rnldr" Jan 05 14:57:26 crc kubenswrapper[4740]: I0105 14:57:26.653398 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4544660-fb90-4ea0-b0a5-b95611c2ff38-catalog-content\") pod \"community-operators-rnldr\" (UID: \"b4544660-fb90-4ea0-b0a5-b95611c2ff38\") " pod="openshift-marketplace/community-operators-rnldr" Jan 05 14:57:26 crc kubenswrapper[4740]: I0105 14:57:26.673266 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmdfw\" (UniqueName: \"kubernetes.io/projected/b4544660-fb90-4ea0-b0a5-b95611c2ff38-kube-api-access-mmdfw\") pod \"community-operators-rnldr\" (UID: \"b4544660-fb90-4ea0-b0a5-b95611c2ff38\") " pod="openshift-marketplace/community-operators-rnldr" Jan 05 14:57:26 crc kubenswrapper[4740]: I0105 14:57:26.792897 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rnldr" Jan 05 14:57:27 crc kubenswrapper[4740]: I0105 14:57:27.355810 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rnldr"] Jan 05 14:57:27 crc kubenswrapper[4740]: I0105 14:57:27.596591 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rnldr" event={"ID":"b4544660-fb90-4ea0-b0a5-b95611c2ff38","Type":"ContainerStarted","Data":"6308550e5a732ea2b22bd800045a3af8aa42f2df36d6cb247c7c94c876dab36e"} Jan 05 14:57:27 crc kubenswrapper[4740]: I0105 14:57:27.596983 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rnldr" event={"ID":"b4544660-fb90-4ea0-b0a5-b95611c2ff38","Type":"ContainerStarted","Data":"159190c6707cc1b2186eaee1f23d7b90a3778f8dff9c0c3581b01b8566670a3d"} Jan 05 14:57:28 crc kubenswrapper[4740]: I0105 14:57:28.613033 4740 generic.go:334] "Generic (PLEG): container finished" podID="b4544660-fb90-4ea0-b0a5-b95611c2ff38" containerID="6308550e5a732ea2b22bd800045a3af8aa42f2df36d6cb247c7c94c876dab36e" exitCode=0 Jan 05 14:57:28 crc kubenswrapper[4740]: I0105 14:57:28.613182 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rnldr" event={"ID":"b4544660-fb90-4ea0-b0a5-b95611c2ff38","Type":"ContainerDied","Data":"6308550e5a732ea2b22bd800045a3af8aa42f2df36d6cb247c7c94c876dab36e"} Jan 05 14:57:28 crc kubenswrapper[4740]: I0105 14:57:28.615453 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 05 14:57:30 crc kubenswrapper[4740]: I0105 14:57:30.638859 4740 generic.go:334] "Generic (PLEG): container finished" podID="b4544660-fb90-4ea0-b0a5-b95611c2ff38" containerID="b458ebf5c0aaf7fda65fd3b9b6cbb2c80403c43c17fcbd072c4057cd087f0905" exitCode=0 Jan 05 14:57:30 crc kubenswrapper[4740]: I0105 14:57:30.638899 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rnldr" event={"ID":"b4544660-fb90-4ea0-b0a5-b95611c2ff38","Type":"ContainerDied","Data":"b458ebf5c0aaf7fda65fd3b9b6cbb2c80403c43c17fcbd072c4057cd087f0905"} Jan 05 14:57:31 crc kubenswrapper[4740]: I0105 14:57:31.658017 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rnldr" event={"ID":"b4544660-fb90-4ea0-b0a5-b95611c2ff38","Type":"ContainerStarted","Data":"344bf495f9095ec90ef8437e616e2f938edf0824e21d9e9848411a3425b2876d"} Jan 05 14:57:31 crc kubenswrapper[4740]: I0105 14:57:31.685360 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rnldr" podStartSLOduration=3.013487274 podStartE2EDuration="5.685344452s" podCreationTimestamp="2026-01-05 14:57:26 +0000 UTC" firstStartedPulling="2026-01-05 14:57:28.615196476 +0000 UTC m=+4097.922105055" lastFinishedPulling="2026-01-05 14:57:31.287053624 +0000 UTC m=+4100.593962233" observedRunningTime="2026-01-05 14:57:31.673811838 +0000 UTC m=+4100.980720447" watchObservedRunningTime="2026-01-05 14:57:31.685344452 +0000 UTC m=+4100.992253031" Jan 05 14:57:31 crc kubenswrapper[4740]: I0105 14:57:31.971243 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:57:31 crc kubenswrapper[4740]: E0105 14:57:31.971928 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:57:36 crc kubenswrapper[4740]: I0105 14:57:36.793676 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rnldr" Jan 05 14:57:36 crc kubenswrapper[4740]: I0105 14:57:36.794518 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rnldr" Jan 05 14:57:36 crc kubenswrapper[4740]: I0105 14:57:36.860378 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rnldr" Jan 05 14:57:37 crc kubenswrapper[4740]: I0105 14:57:37.796056 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rnldr" Jan 05 14:57:37 crc kubenswrapper[4740]: I0105 14:57:37.845113 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rnldr"] Jan 05 14:57:39 crc kubenswrapper[4740]: I0105 14:57:39.759901 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rnldr" podUID="b4544660-fb90-4ea0-b0a5-b95611c2ff38" containerName="registry-server" containerID="cri-o://344bf495f9095ec90ef8437e616e2f938edf0824e21d9e9848411a3425b2876d" gracePeriod=2 Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.289885 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rnldr" Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.313728 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4544660-fb90-4ea0-b0a5-b95611c2ff38-utilities\") pod \"b4544660-fb90-4ea0-b0a5-b95611c2ff38\" (UID: \"b4544660-fb90-4ea0-b0a5-b95611c2ff38\") " Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.313857 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmdfw\" (UniqueName: \"kubernetes.io/projected/b4544660-fb90-4ea0-b0a5-b95611c2ff38-kube-api-access-mmdfw\") pod \"b4544660-fb90-4ea0-b0a5-b95611c2ff38\" (UID: \"b4544660-fb90-4ea0-b0a5-b95611c2ff38\") " Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.313973 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4544660-fb90-4ea0-b0a5-b95611c2ff38-catalog-content\") pod \"b4544660-fb90-4ea0-b0a5-b95611c2ff38\" (UID: \"b4544660-fb90-4ea0-b0a5-b95611c2ff38\") " Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.315598 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4544660-fb90-4ea0-b0a5-b95611c2ff38-utilities" (OuterVolumeSpecName: "utilities") pod "b4544660-fb90-4ea0-b0a5-b95611c2ff38" (UID: "b4544660-fb90-4ea0-b0a5-b95611c2ff38"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.320578 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4544660-fb90-4ea0-b0a5-b95611c2ff38-kube-api-access-mmdfw" (OuterVolumeSpecName: "kube-api-access-mmdfw") pod "b4544660-fb90-4ea0-b0a5-b95611c2ff38" (UID: "b4544660-fb90-4ea0-b0a5-b95611c2ff38"). InnerVolumeSpecName "kube-api-access-mmdfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.388093 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4544660-fb90-4ea0-b0a5-b95611c2ff38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4544660-fb90-4ea0-b0a5-b95611c2ff38" (UID: "b4544660-fb90-4ea0-b0a5-b95611c2ff38"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.415556 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4544660-fb90-4ea0-b0a5-b95611c2ff38-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.415585 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4544660-fb90-4ea0-b0a5-b95611c2ff38-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.415597 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmdfw\" (UniqueName: \"kubernetes.io/projected/b4544660-fb90-4ea0-b0a5-b95611c2ff38-kube-api-access-mmdfw\") on node \"crc\" DevicePath \"\"" Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.775244 4740 generic.go:334] "Generic (PLEG): container finished" podID="b4544660-fb90-4ea0-b0a5-b95611c2ff38" containerID="344bf495f9095ec90ef8437e616e2f938edf0824e21d9e9848411a3425b2876d" exitCode=0 Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.775312 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rnldr" event={"ID":"b4544660-fb90-4ea0-b0a5-b95611c2ff38","Type":"ContainerDied","Data":"344bf495f9095ec90ef8437e616e2f938edf0824e21d9e9848411a3425b2876d"} Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.775338 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rnldr" Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.775353 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rnldr" event={"ID":"b4544660-fb90-4ea0-b0a5-b95611c2ff38","Type":"ContainerDied","Data":"159190c6707cc1b2186eaee1f23d7b90a3778f8dff9c0c3581b01b8566670a3d"} Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.775382 4740 scope.go:117] "RemoveContainer" containerID="344bf495f9095ec90ef8437e616e2f938edf0824e21d9e9848411a3425b2876d" Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.809084 4740 scope.go:117] "RemoveContainer" containerID="b458ebf5c0aaf7fda65fd3b9b6cbb2c80403c43c17fcbd072c4057cd087f0905" Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.832606 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rnldr"] Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.843158 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rnldr"] Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.854895 4740 scope.go:117] "RemoveContainer" containerID="6308550e5a732ea2b22bd800045a3af8aa42f2df36d6cb247c7c94c876dab36e" Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.908807 4740 scope.go:117] "RemoveContainer" containerID="344bf495f9095ec90ef8437e616e2f938edf0824e21d9e9848411a3425b2876d" Jan 05 14:57:40 crc kubenswrapper[4740]: E0105 14:57:40.909284 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"344bf495f9095ec90ef8437e616e2f938edf0824e21d9e9848411a3425b2876d\": container with ID starting with 344bf495f9095ec90ef8437e616e2f938edf0824e21d9e9848411a3425b2876d not found: ID does not exist" containerID="344bf495f9095ec90ef8437e616e2f938edf0824e21d9e9848411a3425b2876d" Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.909316 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"344bf495f9095ec90ef8437e616e2f938edf0824e21d9e9848411a3425b2876d"} err="failed to get container status \"344bf495f9095ec90ef8437e616e2f938edf0824e21d9e9848411a3425b2876d\": rpc error: code = NotFound desc = could not find container \"344bf495f9095ec90ef8437e616e2f938edf0824e21d9e9848411a3425b2876d\": container with ID starting with 344bf495f9095ec90ef8437e616e2f938edf0824e21d9e9848411a3425b2876d not found: ID does not exist" Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.909337 4740 scope.go:117] "RemoveContainer" containerID="b458ebf5c0aaf7fda65fd3b9b6cbb2c80403c43c17fcbd072c4057cd087f0905" Jan 05 14:57:40 crc kubenswrapper[4740]: E0105 14:57:40.909855 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b458ebf5c0aaf7fda65fd3b9b6cbb2c80403c43c17fcbd072c4057cd087f0905\": container with ID starting with b458ebf5c0aaf7fda65fd3b9b6cbb2c80403c43c17fcbd072c4057cd087f0905 not found: ID does not exist" containerID="b458ebf5c0aaf7fda65fd3b9b6cbb2c80403c43c17fcbd072c4057cd087f0905" Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.909927 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b458ebf5c0aaf7fda65fd3b9b6cbb2c80403c43c17fcbd072c4057cd087f0905"} err="failed to get container status \"b458ebf5c0aaf7fda65fd3b9b6cbb2c80403c43c17fcbd072c4057cd087f0905\": rpc error: code = NotFound desc = could not find 
container \"b458ebf5c0aaf7fda65fd3b9b6cbb2c80403c43c17fcbd072c4057cd087f0905\": container with ID starting with b458ebf5c0aaf7fda65fd3b9b6cbb2c80403c43c17fcbd072c4057cd087f0905 not found: ID does not exist" Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.909963 4740 scope.go:117] "RemoveContainer" containerID="6308550e5a732ea2b22bd800045a3af8aa42f2df36d6cb247c7c94c876dab36e" Jan 05 14:57:40 crc kubenswrapper[4740]: E0105 14:57:40.910426 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6308550e5a732ea2b22bd800045a3af8aa42f2df36d6cb247c7c94c876dab36e\": container with ID starting with 6308550e5a732ea2b22bd800045a3af8aa42f2df36d6cb247c7c94c876dab36e not found: ID does not exist" containerID="6308550e5a732ea2b22bd800045a3af8aa42f2df36d6cb247c7c94c876dab36e" Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.910459 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6308550e5a732ea2b22bd800045a3af8aa42f2df36d6cb247c7c94c876dab36e"} err="failed to get container status \"6308550e5a732ea2b22bd800045a3af8aa42f2df36d6cb247c7c94c876dab36e\": rpc error: code = NotFound desc = could not find container \"6308550e5a732ea2b22bd800045a3af8aa42f2df36d6cb247c7c94c876dab36e\": container with ID starting with 6308550e5a732ea2b22bd800045a3af8aa42f2df36d6cb247c7c94c876dab36e not found: ID does not exist" Jan 05 14:57:40 crc kubenswrapper[4740]: I0105 14:57:40.984770 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4544660-fb90-4ea0-b0a5-b95611c2ff38" path="/var/lib/kubelet/pods/b4544660-fb90-4ea0-b0a5-b95611c2ff38/volumes" Jan 05 14:57:42 crc kubenswrapper[4740]: I0105 14:57:42.601935 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8svgw container/gateway namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={ Jan 05 14:57:42 crc kubenswrapper[4740]: "http": "returned status 503, expected 200" Jan 05 14:57:42 crc kubenswrapper[4740]: } Jan 05 14:57:42 crc kubenswrapper[4740]: I0105 14:57:42.602260 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" podUID="fd41c01f-dff3-4b6a-ae38-8b114b384a59" containerName="gateway" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 05 14:57:46 crc kubenswrapper[4740]: I0105 14:57:46.969657 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:57:46 crc kubenswrapper[4740]: E0105 14:57:46.970645 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:57:52 crc kubenswrapper[4740]: I0105 14:57:52.408042 4740 patch_prober.go:28] interesting pod/thanos-querier-78985bc954-b6gsd container/kube-rbac-proxy-web namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.86:9091/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 14:57:52 crc kubenswrapper[4740]: I0105 
14:57:52.408670 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" podUID="3c267f0a-b0bb-43fe-9a21-92472096a632" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.86:9091/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 14:57:58 crc kubenswrapper[4740]: I0105 14:57:58.968191 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:57:58 crc kubenswrapper[4740]: E0105 14:57:58.969049 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:58:07 crc kubenswrapper[4740]: E0105 14:58:07.211931 4740 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.97:49594->38.102.83.97:45101: write tcp 38.102.83.97:49594->38.102.83.97:45101: write: broken pipe Jan 05 14:58:10 crc kubenswrapper[4740]: I0105 14:58:10.989337 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:58:10 crc kubenswrapper[4740]: E0105 14:58:10.992805 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:58:25 crc kubenswrapper[4740]: I0105 14:58:25.968469 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:58:25 crc kubenswrapper[4740]: E0105 14:58:25.969321 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 14:58:34 crc kubenswrapper[4740]: I0105 14:58:34.234212 4740 scope.go:117] "RemoveContainer" containerID="15f33dff0dbf690ec37ba47228dd8a0e0a2f4de4ab4a05e20f1db197c1476692" Jan 05 14:58:34 crc kubenswrapper[4740]: I0105 14:58:34.308813 4740 scope.go:117] "RemoveContainer" containerID="1d3431a8b75ed93a31a38bd7eb3b0d11dcfc573455ec166182ef882c769bfd3c" Jan 05 14:58:34 crc kubenswrapper[4740]: I0105 14:58:34.350920 4740 scope.go:117] "RemoveContainer" containerID="ea42a8f7988ca77ea12c4b65255f4f8d0a9f07132695fc0133af7dc0a4b53e20" Jan 05 14:58:37 crc kubenswrapper[4740]: I0105 14:58:37.969735 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 14:58:39 crc kubenswrapper[4740]: I0105 14:58:39.574960 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"4670c010037ec4b3a80d8c75d95e15ce9bfc5fefe1b21f8c394a8746716d6d93"} Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.046902 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z6mtn"] Jan 05 14:59:36 crc kubenswrapper[4740]: E0105 14:59:36.048343 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4544660-fb90-4ea0-b0a5-b95611c2ff38" containerName="extract-utilities" Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.048361 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4544660-fb90-4ea0-b0a5-b95611c2ff38" containerName="extract-utilities" Jan 05 14:59:36 crc kubenswrapper[4740]: E0105 14:59:36.048386 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4544660-fb90-4ea0-b0a5-b95611c2ff38" containerName="registry-server" Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.048393 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4544660-fb90-4ea0-b0a5-b95611c2ff38" containerName="registry-server" Jan 05 14:59:36 crc kubenswrapper[4740]: E0105 14:59:36.048441 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4544660-fb90-4ea0-b0a5-b95611c2ff38" containerName="extract-content" Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.048449 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4544660-fb90-4ea0-b0a5-b95611c2ff38" containerName="extract-content" Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.048760 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4544660-fb90-4ea0-b0a5-b95611c2ff38" containerName="registry-server" Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.051417 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6mtn" Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.085123 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6mtn"] Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.122735 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s86lk\" (UniqueName: \"kubernetes.io/projected/c629759f-092e-4720-b5f8-a05087ce0c81-kube-api-access-s86lk\") pod \"redhat-marketplace-z6mtn\" (UID: \"c629759f-092e-4720-b5f8-a05087ce0c81\") " pod="openshift-marketplace/redhat-marketplace-z6mtn" Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.122861 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c629759f-092e-4720-b5f8-a05087ce0c81-catalog-content\") pod \"redhat-marketplace-z6mtn\" (UID: \"c629759f-092e-4720-b5f8-a05087ce0c81\") " pod="openshift-marketplace/redhat-marketplace-z6mtn" Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.122926 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c629759f-092e-4720-b5f8-a05087ce0c81-utilities\") pod \"redhat-marketplace-z6mtn\" (UID: \"c629759f-092e-4720-b5f8-a05087ce0c81\") " pod="openshift-marketplace/redhat-marketplace-z6mtn" Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.225503 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s86lk\" (UniqueName: \"kubernetes.io/projected/c629759f-092e-4720-b5f8-a05087ce0c81-kube-api-access-s86lk\") pod \"redhat-marketplace-z6mtn\" (UID: \"c629759f-092e-4720-b5f8-a05087ce0c81\") " pod="openshift-marketplace/redhat-marketplace-z6mtn" Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.225667 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c629759f-092e-4720-b5f8-a05087ce0c81-catalog-content\") pod \"redhat-marketplace-z6mtn\" (UID: \"c629759f-092e-4720-b5f8-a05087ce0c81\") " pod="openshift-marketplace/redhat-marketplace-z6mtn" Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.225761 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c629759f-092e-4720-b5f8-a05087ce0c81-utilities\") pod \"redhat-marketplace-z6mtn\" (UID: \"c629759f-092e-4720-b5f8-a05087ce0c81\") " pod="openshift-marketplace/redhat-marketplace-z6mtn" Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.226500 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c629759f-092e-4720-b5f8-a05087ce0c81-utilities\") pod \"redhat-marketplace-z6mtn\" (UID: \"c629759f-092e-4720-b5f8-a05087ce0c81\") " pod="openshift-marketplace/redhat-marketplace-z6mtn" Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.226696 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c629759f-092e-4720-b5f8-a05087ce0c81-catalog-content\") pod \"redhat-marketplace-z6mtn\" (UID: \"c629759f-092e-4720-b5f8-a05087ce0c81\") " pod="openshift-marketplace/redhat-marketplace-z6mtn" Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.251831 4740 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-s86lk\" (UniqueName: \"kubernetes.io/projected/c629759f-092e-4720-b5f8-a05087ce0c81-kube-api-access-s86lk\") pod \"redhat-marketplace-z6mtn\" (UID: \"c629759f-092e-4720-b5f8-a05087ce0c81\") " pod="openshift-marketplace/redhat-marketplace-z6mtn" Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.387884 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6mtn" Jan 05 14:59:36 crc kubenswrapper[4740]: I0105 14:59:36.897446 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6mtn"] Jan 05 14:59:37 crc kubenswrapper[4740]: I0105 14:59:37.301257 4740 generic.go:334] "Generic (PLEG): container finished" podID="c629759f-092e-4720-b5f8-a05087ce0c81" containerID="e6317db83622671c9e1292a96c727e7d956f4207661767bf1c9fb147ef5b00cf" exitCode=0 Jan 05 14:59:37 crc kubenswrapper[4740]: I0105 14:59:37.301334 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6mtn" event={"ID":"c629759f-092e-4720-b5f8-a05087ce0c81","Type":"ContainerDied","Data":"e6317db83622671c9e1292a96c727e7d956f4207661767bf1c9fb147ef5b00cf"} Jan 05 14:59:37 crc kubenswrapper[4740]: I0105 14:59:37.301499 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6mtn" event={"ID":"c629759f-092e-4720-b5f8-a05087ce0c81","Type":"ContainerStarted","Data":"e82cc95fed1534df06a204c35d5cd04340441d9d8a7efeeae2eaf4ca07af6e15"} Jan 05 14:59:39 crc kubenswrapper[4740]: I0105 14:59:39.349375 4740 generic.go:334] "Generic (PLEG): container finished" podID="c629759f-092e-4720-b5f8-a05087ce0c81" containerID="ff610ae36cc856503e3bb815abd6aacf5277e870f82c723ce6fefce796a3f596" exitCode=0 Jan 05 14:59:39 crc kubenswrapper[4740]: I0105 14:59:39.349610 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6mtn" event={"ID":"c629759f-092e-4720-b5f8-a05087ce0c81","Type":"ContainerDied","Data":"ff610ae36cc856503e3bb815abd6aacf5277e870f82c723ce6fefce796a3f596"} Jan 05 14:59:41 crc kubenswrapper[4740]: I0105 14:59:41.372543 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6mtn" event={"ID":"c629759f-092e-4720-b5f8-a05087ce0c81","Type":"ContainerStarted","Data":"d7e054ae65d97f8c29bb39c0c1a59d437228ba415088a8ea9cd92f52b876d7ba"} Jan 05 14:59:41 crc kubenswrapper[4740]: I0105 14:59:41.402112 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z6mtn" podStartSLOduration=2.937592178 podStartE2EDuration="5.402092468s" podCreationTimestamp="2026-01-05 14:59:36 +0000 UTC" firstStartedPulling="2026-01-05 14:59:37.303124835 +0000 UTC m=+4226.610033404" lastFinishedPulling="2026-01-05 14:59:39.767625115 +0000 UTC m=+4229.074533694" observedRunningTime="2026-01-05 14:59:41.393871349 +0000 UTC m=+4230.700779928" watchObservedRunningTime="2026-01-05 14:59:41.402092468 +0000 UTC m=+4230.709001067" Jan 05 14:59:46 crc kubenswrapper[4740]: I0105 14:59:46.388233 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z6mtn" Jan 05 14:59:46 crc kubenswrapper[4740]: I0105 14:59:46.388843 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z6mtn" Jan 05 14:59:46 crc kubenswrapper[4740]: I0105 14:59:46.465383 4740 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z6mtn" Jan 05 14:59:46 crc kubenswrapper[4740]: I0105 14:59:46.529712 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z6mtn" Jan 05 14:59:46 crc kubenswrapper[4740]: I0105 14:59:46.714538 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6mtn"] Jan 05 14:59:48 crc kubenswrapper[4740]: I0105 14:59:48.473142 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z6mtn" podUID="c629759f-092e-4720-b5f8-a05087ce0c81" containerName="registry-server" containerID="cri-o://d7e054ae65d97f8c29bb39c0c1a59d437228ba415088a8ea9cd92f52b876d7ba" gracePeriod=2 Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.305039 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6mtn" Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.368104 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c629759f-092e-4720-b5f8-a05087ce0c81-catalog-content\") pod \"c629759f-092e-4720-b5f8-a05087ce0c81\" (UID: \"c629759f-092e-4720-b5f8-a05087ce0c81\") " Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.368627 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c629759f-092e-4720-b5f8-a05087ce0c81-utilities\") pod \"c629759f-092e-4720-b5f8-a05087ce0c81\" (UID: \"c629759f-092e-4720-b5f8-a05087ce0c81\") " Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.368981 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s86lk\" (UniqueName: \"kubernetes.io/projected/c629759f-092e-4720-b5f8-a05087ce0c81-kube-api-access-s86lk\") pod \"c629759f-092e-4720-b5f8-a05087ce0c81\" (UID: \"c629759f-092e-4720-b5f8-a05087ce0c81\") " Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.369332 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c629759f-092e-4720-b5f8-a05087ce0c81-utilities" (OuterVolumeSpecName: "utilities") pod "c629759f-092e-4720-b5f8-a05087ce0c81" (UID: "c629759f-092e-4720-b5f8-a05087ce0c81"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.370021 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c629759f-092e-4720-b5f8-a05087ce0c81-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.383119 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c629759f-092e-4720-b5f8-a05087ce0c81-kube-api-access-s86lk" (OuterVolumeSpecName: "kube-api-access-s86lk") pod "c629759f-092e-4720-b5f8-a05087ce0c81" (UID: "c629759f-092e-4720-b5f8-a05087ce0c81"). InnerVolumeSpecName "kube-api-access-s86lk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.392008 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c629759f-092e-4720-b5f8-a05087ce0c81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c629759f-092e-4720-b5f8-a05087ce0c81" (UID: "c629759f-092e-4720-b5f8-a05087ce0c81"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.472323 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c629759f-092e-4720-b5f8-a05087ce0c81-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.472360 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s86lk\" (UniqueName: \"kubernetes.io/projected/c629759f-092e-4720-b5f8-a05087ce0c81-kube-api-access-s86lk\") on node \"crc\" DevicePath \"\"" Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.485010 4740 generic.go:334] "Generic (PLEG): container finished" podID="c629759f-092e-4720-b5f8-a05087ce0c81" containerID="d7e054ae65d97f8c29bb39c0c1a59d437228ba415088a8ea9cd92f52b876d7ba" exitCode=0 Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.485083 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6mtn" Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.485096 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6mtn" event={"ID":"c629759f-092e-4720-b5f8-a05087ce0c81","Type":"ContainerDied","Data":"d7e054ae65d97f8c29bb39c0c1a59d437228ba415088a8ea9cd92f52b876d7ba"} Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.485175 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6mtn" event={"ID":"c629759f-092e-4720-b5f8-a05087ce0c81","Type":"ContainerDied","Data":"e82cc95fed1534df06a204c35d5cd04340441d9d8a7efeeae2eaf4ca07af6e15"} Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.485199 4740 scope.go:117] "RemoveContainer" containerID="d7e054ae65d97f8c29bb39c0c1a59d437228ba415088a8ea9cd92f52b876d7ba" Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.527028 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6mtn"] Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.527698 4740 scope.go:117] "RemoveContainer" containerID="ff610ae36cc856503e3bb815abd6aacf5277e870f82c723ce6fefce796a3f596" Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.539079 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6mtn"] Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.551433 4740 scope.go:117] "RemoveContainer" containerID="e6317db83622671c9e1292a96c727e7d956f4207661767bf1c9fb147ef5b00cf" Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.631879 4740 scope.go:117] "RemoveContainer" containerID="d7e054ae65d97f8c29bb39c0c1a59d437228ba415088a8ea9cd92f52b876d7ba" Jan 05 14:59:49 crc kubenswrapper[4740]: E0105 14:59:49.632340 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7e054ae65d97f8c29bb39c0c1a59d437228ba415088a8ea9cd92f52b876d7ba\": container with ID starting with 
d7e054ae65d97f8c29bb39c0c1a59d437228ba415088a8ea9cd92f52b876d7ba not found: ID does not exist" containerID="d7e054ae65d97f8c29bb39c0c1a59d437228ba415088a8ea9cd92f52b876d7ba" Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.632382 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7e054ae65d97f8c29bb39c0c1a59d437228ba415088a8ea9cd92f52b876d7ba"} err="failed to get container status \"d7e054ae65d97f8c29bb39c0c1a59d437228ba415088a8ea9cd92f52b876d7ba\": rpc error: code = NotFound desc = could not find container \"d7e054ae65d97f8c29bb39c0c1a59d437228ba415088a8ea9cd92f52b876d7ba\": container with ID starting with d7e054ae65d97f8c29bb39c0c1a59d437228ba415088a8ea9cd92f52b876d7ba not found: ID does not exist" Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.632409 4740 scope.go:117] "RemoveContainer" containerID="ff610ae36cc856503e3bb815abd6aacf5277e870f82c723ce6fefce796a3f596" Jan 05 14:59:49 crc kubenswrapper[4740]: E0105 14:59:49.632898 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff610ae36cc856503e3bb815abd6aacf5277e870f82c723ce6fefce796a3f596\": container with ID starting with ff610ae36cc856503e3bb815abd6aacf5277e870f82c723ce6fefce796a3f596 not found: ID does not exist" containerID="ff610ae36cc856503e3bb815abd6aacf5277e870f82c723ce6fefce796a3f596" Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.632926 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff610ae36cc856503e3bb815abd6aacf5277e870f82c723ce6fefce796a3f596"} err="failed to get container status \"ff610ae36cc856503e3bb815abd6aacf5277e870f82c723ce6fefce796a3f596\": rpc error: code = NotFound desc = could not find container \"ff610ae36cc856503e3bb815abd6aacf5277e870f82c723ce6fefce796a3f596\": container with ID starting with ff610ae36cc856503e3bb815abd6aacf5277e870f82c723ce6fefce796a3f596 not found: ID does not exist" Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.632945 4740 scope.go:117] "RemoveContainer" containerID="e6317db83622671c9e1292a96c727e7d956f4207661767bf1c9fb147ef5b00cf" Jan 05 14:59:49 crc kubenswrapper[4740]: E0105 14:59:49.633518 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6317db83622671c9e1292a96c727e7d956f4207661767bf1c9fb147ef5b00cf\": container with ID starting with e6317db83622671c9e1292a96c727e7d956f4207661767bf1c9fb147ef5b00cf not found: ID does not exist" containerID="e6317db83622671c9e1292a96c727e7d956f4207661767bf1c9fb147ef5b00cf" Jan 05 14:59:49 crc kubenswrapper[4740]: I0105 14:59:49.633552 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6317db83622671c9e1292a96c727e7d956f4207661767bf1c9fb147ef5b00cf"} err="failed to get container status \"e6317db83622671c9e1292a96c727e7d956f4207661767bf1c9fb147ef5b00cf\": rpc error: code = NotFound desc = could not find container \"e6317db83622671c9e1292a96c727e7d956f4207661767bf1c9fb147ef5b00cf\": container with ID starting with e6317db83622671c9e1292a96c727e7d956f4207661767bf1c9fb147ef5b00cf not found: ID does not exist" Jan 05 14:59:50 crc kubenswrapper[4740]: I0105 14:59:50.996980 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c629759f-092e-4720-b5f8-a05087ce0c81" path="/var/lib/kubelet/pods/c629759f-092e-4720-b5f8-a05087ce0c81/volumes" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.191240 
4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb"] Jan 05 15:00:00 crc kubenswrapper[4740]: E0105 15:00:00.192328 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c629759f-092e-4720-b5f8-a05087ce0c81" containerName="registry-server" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.192341 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c629759f-092e-4720-b5f8-a05087ce0c81" containerName="registry-server" Jan 05 15:00:00 crc kubenswrapper[4740]: E0105 15:00:00.192378 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c629759f-092e-4720-b5f8-a05087ce0c81" containerName="extract-utilities" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.192384 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c629759f-092e-4720-b5f8-a05087ce0c81" containerName="extract-utilities" Jan 05 15:00:00 crc kubenswrapper[4740]: E0105 15:00:00.192410 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c629759f-092e-4720-b5f8-a05087ce0c81" containerName="extract-content" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.192420 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c629759f-092e-4720-b5f8-a05087ce0c81" containerName="extract-content" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.192667 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="c629759f-092e-4720-b5f8-a05087ce0c81" containerName="registry-server" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.193542 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.200610 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.200696 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.217335 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb"] Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.342018 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-config-volume\") pod \"collect-profiles-29460420-pp6nb\" (UID: \"aca2ab31-0fe1-4d10-b59f-07a8781e78ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.342109 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-secret-volume\") pod \"collect-profiles-29460420-pp6nb\" (UID: \"aca2ab31-0fe1-4d10-b59f-07a8781e78ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.342141 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skfdh\" (UniqueName: \"kubernetes.io/projected/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-kube-api-access-skfdh\") pod \"collect-profiles-29460420-pp6nb\" (UID: 
\"aca2ab31-0fe1-4d10-b59f-07a8781e78ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.444939 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-config-volume\") pod \"collect-profiles-29460420-pp6nb\" (UID: \"aca2ab31-0fe1-4d10-b59f-07a8781e78ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.445017 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-secret-volume\") pod \"collect-profiles-29460420-pp6nb\" (UID: \"aca2ab31-0fe1-4d10-b59f-07a8781e78ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.445041 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skfdh\" (UniqueName: \"kubernetes.io/projected/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-kube-api-access-skfdh\") pod \"collect-profiles-29460420-pp6nb\" (UID: \"aca2ab31-0fe1-4d10-b59f-07a8781e78ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.445911 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-config-volume\") pod \"collect-profiles-29460420-pp6nb\" (UID: \"aca2ab31-0fe1-4d10-b59f-07a8781e78ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.453722 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-secret-volume\") pod \"collect-profiles-29460420-pp6nb\" (UID: \"aca2ab31-0fe1-4d10-b59f-07a8781e78ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.462382 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skfdh\" (UniqueName: \"kubernetes.io/projected/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-kube-api-access-skfdh\") pod \"collect-profiles-29460420-pp6nb\" (UID: \"aca2ab31-0fe1-4d10-b59f-07a8781e78ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb" Jan 05 15:00:00 crc kubenswrapper[4740]: I0105 15:00:00.538810 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb" Jan 05 15:00:01 crc kubenswrapper[4740]: I0105 15:00:01.211486 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb"] Jan 05 15:00:01 crc kubenswrapper[4740]: I0105 15:00:01.683569 4740 generic.go:334] "Generic (PLEG): container finished" podID="aca2ab31-0fe1-4d10-b59f-07a8781e78ed" containerID="f4ceef938ce3710b66c95daa2d9db2eaab2630010a1f4941ccca386e5a9e88c6" exitCode=0 Jan 05 15:00:01 crc kubenswrapper[4740]: I0105 15:00:01.683802 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb" event={"ID":"aca2ab31-0fe1-4d10-b59f-07a8781e78ed","Type":"ContainerDied","Data":"f4ceef938ce3710b66c95daa2d9db2eaab2630010a1f4941ccca386e5a9e88c6"} Jan 05 15:00:01 crc kubenswrapper[4740]: I0105 15:00:01.683828 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb" event={"ID":"aca2ab31-0fe1-4d10-b59f-07a8781e78ed","Type":"ContainerStarted","Data":"076f976df16ddf5c568d46c6781fad16dd59e74ac4c5d3cc8eea1c8ac4cf4407"} Jan 05 15:00:03 crc kubenswrapper[4740]: I0105 15:00:03.574736 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb" Jan 05 15:00:03 crc kubenswrapper[4740]: I0105 15:00:03.706936 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb" event={"ID":"aca2ab31-0fe1-4d10-b59f-07a8781e78ed","Type":"ContainerDied","Data":"076f976df16ddf5c568d46c6781fad16dd59e74ac4c5d3cc8eea1c8ac4cf4407"} Jan 05 15:00:03 crc kubenswrapper[4740]: I0105 15:00:03.706984 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="076f976df16ddf5c568d46c6781fad16dd59e74ac4c5d3cc8eea1c8ac4cf4407" Jan 05 15:00:03 crc kubenswrapper[4740]: I0105 15:00:03.707102 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460420-pp6nb" Jan 05 15:00:03 crc kubenswrapper[4740]: I0105 15:00:03.735196 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skfdh\" (UniqueName: \"kubernetes.io/projected/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-kube-api-access-skfdh\") pod \"aca2ab31-0fe1-4d10-b59f-07a8781e78ed\" (UID: \"aca2ab31-0fe1-4d10-b59f-07a8781e78ed\") " Jan 05 15:00:03 crc kubenswrapper[4740]: I0105 15:00:03.735393 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-config-volume\") pod \"aca2ab31-0fe1-4d10-b59f-07a8781e78ed\" (UID: \"aca2ab31-0fe1-4d10-b59f-07a8781e78ed\") " Jan 05 15:00:03 crc kubenswrapper[4740]: I0105 15:00:03.735553 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-secret-volume\") pod \"aca2ab31-0fe1-4d10-b59f-07a8781e78ed\" (UID: \"aca2ab31-0fe1-4d10-b59f-07a8781e78ed\") " Jan 05 15:00:03 crc kubenswrapper[4740]: I0105 15:00:03.736264 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-config-volume" (OuterVolumeSpecName: "config-volume") pod "aca2ab31-0fe1-4d10-b59f-07a8781e78ed" (UID: "aca2ab31-0fe1-4d10-b59f-07a8781e78ed"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 15:00:03 crc kubenswrapper[4740]: I0105 15:00:03.742420 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "aca2ab31-0fe1-4d10-b59f-07a8781e78ed" (UID: "aca2ab31-0fe1-4d10-b59f-07a8781e78ed"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 15:00:03 crc kubenswrapper[4740]: I0105 15:00:03.745717 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-kube-api-access-skfdh" (OuterVolumeSpecName: "kube-api-access-skfdh") pod "aca2ab31-0fe1-4d10-b59f-07a8781e78ed" (UID: "aca2ab31-0fe1-4d10-b59f-07a8781e78ed"). InnerVolumeSpecName "kube-api-access-skfdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 15:00:03 crc kubenswrapper[4740]: I0105 15:00:03.838753 4740 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 05 15:00:03 crc kubenswrapper[4740]: I0105 15:00:03.838784 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skfdh\" (UniqueName: \"kubernetes.io/projected/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-kube-api-access-skfdh\") on node \"crc\" DevicePath \"\"" Jan 05 15:00:03 crc kubenswrapper[4740]: I0105 15:00:03.838793 4740 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aca2ab31-0fe1-4d10-b59f-07a8781e78ed-config-volume\") on node \"crc\" DevicePath \"\"" Jan 05 15:00:04 crc kubenswrapper[4740]: I0105 15:00:04.652906 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p"] Jan 05 15:00:04 crc kubenswrapper[4740]: I0105 15:00:04.665529 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460375-xwp9p"] Jan 05 15:00:04 crc kubenswrapper[4740]: I0105 15:00:04.987705 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0120f59f-644d-4257-ab41-20f87b94c07e" path="/var/lib/kubelet/pods/0120f59f-644d-4257-ab41-20f87b94c07e/volumes" Jan 05 15:00:34 crc kubenswrapper[4740]: I0105 15:00:34.493022 4740 scope.go:117] "RemoveContainer" containerID="9f390851fdde689d96b612fdd0d1cd24e7e9197f0bb49313a0bc71c2f3406d5e" Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.171698 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29460421-bppnn"] Jan 05 15:01:00 crc kubenswrapper[4740]: E0105 15:01:00.172750 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca2ab31-0fe1-4d10-b59f-07a8781e78ed" containerName="collect-profiles" Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.172767 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca2ab31-0fe1-4d10-b59f-07a8781e78ed" containerName="collect-profiles" Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.173042 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca2ab31-0fe1-4d10-b59f-07a8781e78ed" containerName="collect-profiles" Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.179523 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29460421-bppnn" Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.227212 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29460421-bppnn"] Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.275394 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qlrs\" (UniqueName: \"kubernetes.io/projected/c1324171-d9ac-4ee9-8fae-36557d38ad3e-kube-api-access-2qlrs\") pod \"keystone-cron-29460421-bppnn\" (UID: \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\") " pod="openstack/keystone-cron-29460421-bppnn" Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.275544 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-config-data\") pod \"keystone-cron-29460421-bppnn\" (UID: \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\") " pod="openstack/keystone-cron-29460421-bppnn" Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.275729 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-combined-ca-bundle\") pod \"keystone-cron-29460421-bppnn\" (UID: \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\") " pod="openstack/keystone-cron-29460421-bppnn" Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.275870 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-fernet-keys\") pod \"keystone-cron-29460421-bppnn\" (UID: \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\") " pod="openstack/keystone-cron-29460421-bppnn" Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.377633 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-fernet-keys\") pod \"keystone-cron-29460421-bppnn\" (UID: \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\") " pod="openstack/keystone-cron-29460421-bppnn" Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.377784 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qlrs\" (UniqueName: \"kubernetes.io/projected/c1324171-d9ac-4ee9-8fae-36557d38ad3e-kube-api-access-2qlrs\") pod \"keystone-cron-29460421-bppnn\" (UID: \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\") " pod="openstack/keystone-cron-29460421-bppnn" Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.377843 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-config-data\") pod \"keystone-cron-29460421-bppnn\" (UID: \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\") " pod="openstack/keystone-cron-29460421-bppnn" Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.377995 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-combined-ca-bundle\") pod \"keystone-cron-29460421-bppnn\" (UID: \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\") " pod="openstack/keystone-cron-29460421-bppnn" Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.384638 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-config-data\") pod \"keystone-cron-29460421-bppnn\" (UID: \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\") " pod="openstack/keystone-cron-29460421-bppnn" Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.385116 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-combined-ca-bundle\") pod \"keystone-cron-29460421-bppnn\" (UID: \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\") " pod="openstack/keystone-cron-29460421-bppnn" Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.385991 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-fernet-keys\") pod \"keystone-cron-29460421-bppnn\" (UID: \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\") " pod="openstack/keystone-cron-29460421-bppnn" Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.401438 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qlrs\" (UniqueName: \"kubernetes.io/projected/c1324171-d9ac-4ee9-8fae-36557d38ad3e-kube-api-access-2qlrs\") pod \"keystone-cron-29460421-bppnn\" (UID: \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\") " pod="openstack/keystone-cron-29460421-bppnn" Jan 05 15:01:00 crc kubenswrapper[4740]: I0105 15:01:00.505362 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29460421-bppnn" Jan 05 15:01:01 crc kubenswrapper[4740]: I0105 15:01:01.049919 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29460421-bppnn"] Jan 05 15:01:01 crc kubenswrapper[4740]: I0105 15:01:01.489873 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29460421-bppnn" event={"ID":"c1324171-d9ac-4ee9-8fae-36557d38ad3e","Type":"ContainerStarted","Data":"5dbc82d4a3f7f6e5f06fa4e90f63c214dcc862eb06f405ce6234e51c5899b8d7"} Jan 05 15:01:01 crc kubenswrapper[4740]: I0105 15:01:01.490241 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29460421-bppnn" event={"ID":"c1324171-d9ac-4ee9-8fae-36557d38ad3e","Type":"ContainerStarted","Data":"af8f25c3f77872e7f9d734df4df3f521ec93c15ec48533e481463e5791116dbb"} Jan 05 15:01:01 crc kubenswrapper[4740]: I0105 15:01:01.519116 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29460421-bppnn" podStartSLOduration=1.51908917 podStartE2EDuration="1.51908917s" podCreationTimestamp="2026-01-05 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 15:01:01.510960743 +0000 UTC m=+4310.817869322" watchObservedRunningTime="2026-01-05 15:01:01.51908917 +0000 UTC m=+4310.825997759" Jan 05 15:01:01 crc kubenswrapper[4740]: I0105 15:01:01.915909 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 15:01:01 crc kubenswrapper[4740]: I0105 15:01:01.916004 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 15:01:04 crc kubenswrapper[4740]: I0105 15:01:04.536275 4740 generic.go:334] "Generic (PLEG): container finished" podID="c1324171-d9ac-4ee9-8fae-36557d38ad3e" containerID="5dbc82d4a3f7f6e5f06fa4e90f63c214dcc862eb06f405ce6234e51c5899b8d7" exitCode=0 Jan 05 15:01:04 crc kubenswrapper[4740]: I0105 15:01:04.536416 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29460421-bppnn" event={"ID":"c1324171-d9ac-4ee9-8fae-36557d38ad3e","Type":"ContainerDied","Data":"5dbc82d4a3f7f6e5f06fa4e90f63c214dcc862eb06f405ce6234e51c5899b8d7"} Jan 05 15:01:06 crc kubenswrapper[4740]: I0105 15:01:06.220464 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29460421-bppnn" Jan 05 15:01:06 crc kubenswrapper[4740]: I0105 15:01:06.257285 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-fernet-keys\") pod \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\" (UID: \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\") " Jan 05 15:01:06 crc kubenswrapper[4740]: I0105 15:01:06.257501 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-combined-ca-bundle\") pod \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\" (UID: \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\") " Jan 05 15:01:06 crc kubenswrapper[4740]: I0105 15:01:06.257572 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qlrs\" (UniqueName: \"kubernetes.io/projected/c1324171-d9ac-4ee9-8fae-36557d38ad3e-kube-api-access-2qlrs\") pod \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\" (UID: \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\") " Jan 05 15:01:06 crc kubenswrapper[4740]: I0105 15:01:06.257780 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-config-data\") pod \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\" (UID: \"c1324171-d9ac-4ee9-8fae-36557d38ad3e\") " Jan 05 15:01:06 crc kubenswrapper[4740]: I0105 15:01:06.264164 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1324171-d9ac-4ee9-8fae-36557d38ad3e-kube-api-access-2qlrs" (OuterVolumeSpecName: "kube-api-access-2qlrs") pod "c1324171-d9ac-4ee9-8fae-36557d38ad3e" (UID: "c1324171-d9ac-4ee9-8fae-36557d38ad3e"). InnerVolumeSpecName "kube-api-access-2qlrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 15:01:06 crc kubenswrapper[4740]: I0105 15:01:06.265815 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c1324171-d9ac-4ee9-8fae-36557d38ad3e" (UID: "c1324171-d9ac-4ee9-8fae-36557d38ad3e"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 15:01:06 crc kubenswrapper[4740]: I0105 15:01:06.292595 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1324171-d9ac-4ee9-8fae-36557d38ad3e" (UID: "c1324171-d9ac-4ee9-8fae-36557d38ad3e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 15:01:06 crc kubenswrapper[4740]: I0105 15:01:06.320940 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-config-data" (OuterVolumeSpecName: "config-data") pod "c1324171-d9ac-4ee9-8fae-36557d38ad3e" (UID: "c1324171-d9ac-4ee9-8fae-36557d38ad3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 15:01:06 crc kubenswrapper[4740]: I0105 15:01:06.360801 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 15:01:06 crc kubenswrapper[4740]: I0105 15:01:06.360832 4740 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 05 15:01:06 crc kubenswrapper[4740]: I0105 15:01:06.360841 4740 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1324171-d9ac-4ee9-8fae-36557d38ad3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 05 15:01:06 crc kubenswrapper[4740]: I0105 15:01:06.360854 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qlrs\" (UniqueName: \"kubernetes.io/projected/c1324171-d9ac-4ee9-8fae-36557d38ad3e-kube-api-access-2qlrs\") on node \"crc\" DevicePath \"\"" Jan 05 15:01:06 crc kubenswrapper[4740]: I0105 15:01:06.566432 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29460421-bppnn" event={"ID":"c1324171-d9ac-4ee9-8fae-36557d38ad3e","Type":"ContainerDied","Data":"af8f25c3f77872e7f9d734df4df3f521ec93c15ec48533e481463e5791116dbb"} Jan 05 15:01:06 crc kubenswrapper[4740]: I0105 15:01:06.566489 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af8f25c3f77872e7f9d734df4df3f521ec93c15ec48533e481463e5791116dbb" Jan 05 15:01:06 crc kubenswrapper[4740]: I0105 15:01:06.566563 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29460421-bppnn" Jan 05 15:01:31 crc kubenswrapper[4740]: I0105 15:01:31.915626 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 15:01:31 crc kubenswrapper[4740]: I0105 15:01:31.916340 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 15:01:49 crc kubenswrapper[4740]: I0105 15:01:49.082244 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lpth4"] Jan 05 15:01:49 crc kubenswrapper[4740]: E0105 15:01:49.083336 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1324171-d9ac-4ee9-8fae-36557d38ad3e" containerName="keystone-cron" Jan 05 15:01:49 crc kubenswrapper[4740]: I0105 15:01:49.083351 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1324171-d9ac-4ee9-8fae-36557d38ad3e" containerName="keystone-cron" Jan 05 15:01:49 crc kubenswrapper[4740]: I0105 15:01:49.083805 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1324171-d9ac-4ee9-8fae-36557d38ad3e" containerName="keystone-cron" Jan 05 15:01:49 crc kubenswrapper[4740]: I0105 15:01:49.085808 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lpth4" Jan 05 15:01:49 crc kubenswrapper[4740]: I0105 15:01:49.094935 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lpth4"] Jan 05 15:01:49 crc kubenswrapper[4740]: I0105 15:01:49.199270 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e5b96ea-417f-42c5-a1c3-0621726b08a1-catalog-content\") pod \"redhat-operators-lpth4\" (UID: \"1e5b96ea-417f-42c5-a1c3-0621726b08a1\") " pod="openshift-marketplace/redhat-operators-lpth4" Jan 05 15:01:49 crc kubenswrapper[4740]: I0105 15:01:49.199345 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-646dj\" (UniqueName: \"kubernetes.io/projected/1e5b96ea-417f-42c5-a1c3-0621726b08a1-kube-api-access-646dj\") pod \"redhat-operators-lpth4\" (UID: \"1e5b96ea-417f-42c5-a1c3-0621726b08a1\") " pod="openshift-marketplace/redhat-operators-lpth4" Jan 05 15:01:49 crc kubenswrapper[4740]: I0105 15:01:49.199889 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e5b96ea-417f-42c5-a1c3-0621726b08a1-utilities\") pod \"redhat-operators-lpth4\" (UID: \"1e5b96ea-417f-42c5-a1c3-0621726b08a1\") " pod="openshift-marketplace/redhat-operators-lpth4" Jan 05 15:01:49 crc kubenswrapper[4740]: I0105 15:01:49.302368 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e5b96ea-417f-42c5-a1c3-0621726b08a1-utilities\") pod \"redhat-operators-lpth4\" (UID: \"1e5b96ea-417f-42c5-a1c3-0621726b08a1\") " pod="openshift-marketplace/redhat-operators-lpth4" Jan 05 
15:01:49 crc kubenswrapper[4740]: I0105 15:01:49.302545 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e5b96ea-417f-42c5-a1c3-0621726b08a1-catalog-content\") pod \"redhat-operators-lpth4\" (UID: \"1e5b96ea-417f-42c5-a1c3-0621726b08a1\") " pod="openshift-marketplace/redhat-operators-lpth4" Jan 05 15:01:49 crc kubenswrapper[4740]: I0105 15:01:49.302588 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-646dj\" (UniqueName: \"kubernetes.io/projected/1e5b96ea-417f-42c5-a1c3-0621726b08a1-kube-api-access-646dj\") pod \"redhat-operators-lpth4\" (UID: \"1e5b96ea-417f-42c5-a1c3-0621726b08a1\") " pod="openshift-marketplace/redhat-operators-lpth4" Jan 05 15:01:49 crc kubenswrapper[4740]: I0105 15:01:49.303664 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e5b96ea-417f-42c5-a1c3-0621726b08a1-utilities\") pod \"redhat-operators-lpth4\" (UID: \"1e5b96ea-417f-42c5-a1c3-0621726b08a1\") " pod="openshift-marketplace/redhat-operators-lpth4" Jan 05 15:01:49 crc kubenswrapper[4740]: I0105 15:01:49.303935 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e5b96ea-417f-42c5-a1c3-0621726b08a1-catalog-content\") pod \"redhat-operators-lpth4\" (UID: \"1e5b96ea-417f-42c5-a1c3-0621726b08a1\") " pod="openshift-marketplace/redhat-operators-lpth4" Jan 05 15:01:49 crc kubenswrapper[4740]: I0105 15:01:49.326314 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-646dj\" (UniqueName: \"kubernetes.io/projected/1e5b96ea-417f-42c5-a1c3-0621726b08a1-kube-api-access-646dj\") pod \"redhat-operators-lpth4\" (UID: \"1e5b96ea-417f-42c5-a1c3-0621726b08a1\") " pod="openshift-marketplace/redhat-operators-lpth4" Jan 05 15:01:49 crc kubenswrapper[4740]: I0105 15:01:49.413399 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lpth4" Jan 05 15:01:49 crc kubenswrapper[4740]: I0105 15:01:49.967824 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lpth4"] Jan 05 15:01:50 crc kubenswrapper[4740]: I0105 15:01:50.321141 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpth4" event={"ID":"1e5b96ea-417f-42c5-a1c3-0621726b08a1","Type":"ContainerStarted","Data":"f90ecdbc4188e3db071335ae06c7f933e503cbda5ff3251880466b15875be778"} Jan 05 15:01:51 crc kubenswrapper[4740]: I0105 15:01:51.332893 4740 generic.go:334] "Generic (PLEG): container finished" podID="1e5b96ea-417f-42c5-a1c3-0621726b08a1" containerID="d9abe246e790cd2f927f3d104c072b267f37a29b35a18f7dfd4aeb6e80ec7121" exitCode=0 Jan 05 15:01:51 crc kubenswrapper[4740]: I0105 15:01:51.332988 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpth4" event={"ID":"1e5b96ea-417f-42c5-a1c3-0621726b08a1","Type":"ContainerDied","Data":"d9abe246e790cd2f927f3d104c072b267f37a29b35a18f7dfd4aeb6e80ec7121"} Jan 05 15:01:53 crc kubenswrapper[4740]: I0105 15:01:53.367576 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpth4" event={"ID":"1e5b96ea-417f-42c5-a1c3-0621726b08a1","Type":"ContainerStarted","Data":"bbb778c40d131926b8af5a3c3f832b6ddebb10f64e7ef749b113afa54d06ac7e"} Jan 05 15:02:01 crc kubenswrapper[4740]: I0105 15:02:01.458774 4740 generic.go:334] "Generic (PLEG): container finished" podID="1e5b96ea-417f-42c5-a1c3-0621726b08a1" containerID="bbb778c40d131926b8af5a3c3f832b6ddebb10f64e7ef749b113afa54d06ac7e" exitCode=0 Jan 05 15:02:01 crc kubenswrapper[4740]: I0105 15:02:01.458874 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpth4" event={"ID":"1e5b96ea-417f-42c5-a1c3-0621726b08a1","Type":"ContainerDied","Data":"bbb778c40d131926b8af5a3c3f832b6ddebb10f64e7ef749b113afa54d06ac7e"} Jan 05 15:02:01 crc kubenswrapper[4740]: I0105 15:02:01.916266 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 15:02:01 crc kubenswrapper[4740]: I0105 15:02:01.916344 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 15:02:01 crc kubenswrapper[4740]: I0105 15:02:01.916400 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 15:02:01 crc kubenswrapper[4740]: I0105 15:02:01.917461 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4670c010037ec4b3a80d8c75d95e15ce9bfc5fefe1b21f8c394a8746716d6d93"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 15:02:01 crc kubenswrapper[4740]: I0105 15:02:01.917537 4740 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://4670c010037ec4b3a80d8c75d95e15ce9bfc5fefe1b21f8c394a8746716d6d93" gracePeriod=600 Jan 05 15:02:02 crc kubenswrapper[4740]: I0105 15:02:02.477137 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpth4" event={"ID":"1e5b96ea-417f-42c5-a1c3-0621726b08a1","Type":"ContainerStarted","Data":"0587ff30b3c50d85ad4078f5136d02adcd8e8a160bfe58a86329ed1eebb6cf9f"} Jan 05 15:02:02 crc kubenswrapper[4740]: I0105 15:02:02.491025 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="4670c010037ec4b3a80d8c75d95e15ce9bfc5fefe1b21f8c394a8746716d6d93" exitCode=0 Jan 05 15:02:02 crc kubenswrapper[4740]: I0105 15:02:02.491091 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"4670c010037ec4b3a80d8c75d95e15ce9bfc5fefe1b21f8c394a8746716d6d93"} Jan 05 15:02:02 crc kubenswrapper[4740]: I0105 15:02:02.491129 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be"} Jan 05 15:02:02 crc kubenswrapper[4740]: I0105 15:02:02.491150 4740 scope.go:117] "RemoveContainer" containerID="9071547b9569bd186096ba9726f8aa25b88b46f4220691d7c538f5c3fd2017fa" Jan 05 15:02:02 crc kubenswrapper[4740]: I0105 15:02:02.511263 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lpth4" podStartSLOduration=2.885858854 podStartE2EDuration="13.511241354s" podCreationTimestamp="2026-01-05 15:01:49 +0000 UTC" firstStartedPulling="2026-01-05 15:01:51.335347479 +0000 UTC m=+4360.642256058" lastFinishedPulling="2026-01-05 15:02:01.960729959 +0000 UTC m=+4371.267638558" observedRunningTime="2026-01-05 15:02:02.506382674 +0000 UTC m=+4371.813291263" watchObservedRunningTime="2026-01-05 15:02:02.511241354 +0000 UTC m=+4371.818149943" Jan 05 15:02:09 crc kubenswrapper[4740]: I0105 15:02:09.415680 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lpth4" Jan 05 15:02:09 crc kubenswrapper[4740]: I0105 15:02:09.416262 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lpth4" Jan 05 15:02:10 crc kubenswrapper[4740]: I0105 15:02:10.473631 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lpth4" podUID="1e5b96ea-417f-42c5-a1c3-0621726b08a1" containerName="registry-server" probeResult="failure" output=< Jan 05 15:02:10 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:02:10 crc kubenswrapper[4740]: > Jan 05 15:02:19 crc kubenswrapper[4740]: I0105 15:02:19.468299 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lpth4" Jan 05 15:02:19 crc kubenswrapper[4740]: I0105 15:02:19.523278 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lpth4" Jan 05 15:02:20 crc kubenswrapper[4740]: I0105 15:02:20.293205 
4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lpth4"] Jan 05 15:02:20 crc kubenswrapper[4740]: I0105 15:02:20.740783 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lpth4" podUID="1e5b96ea-417f-42c5-a1c3-0621726b08a1" containerName="registry-server" containerID="cri-o://0587ff30b3c50d85ad4078f5136d02adcd8e8a160bfe58a86329ed1eebb6cf9f" gracePeriod=2 Jan 05 15:02:21 crc kubenswrapper[4740]: I0105 15:02:21.766237 4740 generic.go:334] "Generic (PLEG): container finished" podID="1e5b96ea-417f-42c5-a1c3-0621726b08a1" containerID="0587ff30b3c50d85ad4078f5136d02adcd8e8a160bfe58a86329ed1eebb6cf9f" exitCode=0 Jan 05 15:02:21 crc kubenswrapper[4740]: I0105 15:02:21.766351 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpth4" event={"ID":"1e5b96ea-417f-42c5-a1c3-0621726b08a1","Type":"ContainerDied","Data":"0587ff30b3c50d85ad4078f5136d02adcd8e8a160bfe58a86329ed1eebb6cf9f"} Jan 05 15:02:21 crc kubenswrapper[4740]: I0105 15:02:21.955292 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lpth4" Jan 05 15:02:22 crc kubenswrapper[4740]: I0105 15:02:22.100553 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e5b96ea-417f-42c5-a1c3-0621726b08a1-utilities\") pod \"1e5b96ea-417f-42c5-a1c3-0621726b08a1\" (UID: \"1e5b96ea-417f-42c5-a1c3-0621726b08a1\") " Jan 05 15:02:22 crc kubenswrapper[4740]: I0105 15:02:22.100822 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-646dj\" (UniqueName: \"kubernetes.io/projected/1e5b96ea-417f-42c5-a1c3-0621726b08a1-kube-api-access-646dj\") pod \"1e5b96ea-417f-42c5-a1c3-0621726b08a1\" (UID: \"1e5b96ea-417f-42c5-a1c3-0621726b08a1\") " Jan 05 15:02:22 crc kubenswrapper[4740]: I0105 15:02:22.100866 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e5b96ea-417f-42c5-a1c3-0621726b08a1-catalog-content\") pod \"1e5b96ea-417f-42c5-a1c3-0621726b08a1\" (UID: \"1e5b96ea-417f-42c5-a1c3-0621726b08a1\") " Jan 05 15:02:22 crc kubenswrapper[4740]: I0105 15:02:22.102419 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e5b96ea-417f-42c5-a1c3-0621726b08a1-utilities" (OuterVolumeSpecName: "utilities") pod "1e5b96ea-417f-42c5-a1c3-0621726b08a1" (UID: "1e5b96ea-417f-42c5-a1c3-0621726b08a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:02:22 crc kubenswrapper[4740]: I0105 15:02:22.131371 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e5b96ea-417f-42c5-a1c3-0621726b08a1-kube-api-access-646dj" (OuterVolumeSpecName: "kube-api-access-646dj") pod "1e5b96ea-417f-42c5-a1c3-0621726b08a1" (UID: "1e5b96ea-417f-42c5-a1c3-0621726b08a1"). InnerVolumeSpecName "kube-api-access-646dj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 15:02:22 crc kubenswrapper[4740]: I0105 15:02:22.203790 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e5b96ea-417f-42c5-a1c3-0621726b08a1-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 15:02:22 crc kubenswrapper[4740]: I0105 15:02:22.203822 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-646dj\" (UniqueName: \"kubernetes.io/projected/1e5b96ea-417f-42c5-a1c3-0621726b08a1-kube-api-access-646dj\") on node \"crc\" DevicePath \"\"" Jan 05 15:02:22 crc kubenswrapper[4740]: I0105 15:02:22.215319 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e5b96ea-417f-42c5-a1c3-0621726b08a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e5b96ea-417f-42c5-a1c3-0621726b08a1" (UID: "1e5b96ea-417f-42c5-a1c3-0621726b08a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:02:22 crc kubenswrapper[4740]: I0105 15:02:22.305753 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e5b96ea-417f-42c5-a1c3-0621726b08a1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 15:02:22 crc kubenswrapper[4740]: I0105 15:02:22.782397 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpth4" event={"ID":"1e5b96ea-417f-42c5-a1c3-0621726b08a1","Type":"ContainerDied","Data":"f90ecdbc4188e3db071335ae06c7f933e503cbda5ff3251880466b15875be778"} Jan 05 15:02:22 crc kubenswrapper[4740]: I0105 15:02:22.782469 4740 scope.go:117] "RemoveContainer" containerID="0587ff30b3c50d85ad4078f5136d02adcd8e8a160bfe58a86329ed1eebb6cf9f" Jan 05 15:02:22 crc kubenswrapper[4740]: I0105 15:02:22.782495 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lpth4" Jan 05 15:02:22 crc kubenswrapper[4740]: I0105 15:02:22.811960 4740 scope.go:117] "RemoveContainer" containerID="bbb778c40d131926b8af5a3c3f832b6ddebb10f64e7ef749b113afa54d06ac7e" Jan 05 15:02:22 crc kubenswrapper[4740]: I0105 15:02:22.850291 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lpth4"] Jan 05 15:02:22 crc kubenswrapper[4740]: I0105 15:02:22.857439 4740 scope.go:117] "RemoveContainer" containerID="d9abe246e790cd2f927f3d104c072b267f37a29b35a18f7dfd4aeb6e80ec7121" Jan 05 15:02:22 crc kubenswrapper[4740]: I0105 15:02:22.862605 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lpth4"] Jan 05 15:02:22 crc kubenswrapper[4740]: I0105 15:02:22.980507 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e5b96ea-417f-42c5-a1c3-0621726b08a1" path="/var/lib/kubelet/pods/1e5b96ea-417f-42c5-a1c3-0621726b08a1/volumes" Jan 05 15:03:00 crc kubenswrapper[4740]: I0105 15:03:00.232477 4740 trace.go:236] Trace[370501028]: "Calculate volume metrics of trusted-ca for pod openshift-logging/collector-bx9tf" (05-Jan-2026 15:02:57.410) (total time: 2821ms): Jan 05 15:03:00 crc kubenswrapper[4740]: Trace[370501028]: [2.821273526s] [2.821273526s] END Jan 05 15:04:31 crc kubenswrapper[4740]: I0105 15:04:31.916410 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 15:04:31 crc kubenswrapper[4740]: I0105 15:04:31.916940 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 15:04:33 crc kubenswrapper[4740]: I0105 15:04:33.149500 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-thc5t"] Jan 05 15:04:33 crc kubenswrapper[4740]: E0105 15:04:33.150675 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e5b96ea-417f-42c5-a1c3-0621726b08a1" containerName="extract-utilities" Jan 05 15:04:33 crc kubenswrapper[4740]: I0105 15:04:33.150693 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e5b96ea-417f-42c5-a1c3-0621726b08a1" containerName="extract-utilities" Jan 05 15:04:33 crc kubenswrapper[4740]: E0105 15:04:33.150716 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e5b96ea-417f-42c5-a1c3-0621726b08a1" containerName="registry-server" Jan 05 15:04:33 crc kubenswrapper[4740]: I0105 15:04:33.150724 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e5b96ea-417f-42c5-a1c3-0621726b08a1" containerName="registry-server" Jan 05 15:04:33 crc kubenswrapper[4740]: E0105 15:04:33.150759 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e5b96ea-417f-42c5-a1c3-0621726b08a1" containerName="extract-content" Jan 05 15:04:33 crc kubenswrapper[4740]: I0105 15:04:33.150768 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e5b96ea-417f-42c5-a1c3-0621726b08a1" containerName="extract-content" Jan 05 15:04:33 crc kubenswrapper[4740]: I0105 15:04:33.151136 4740 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1e5b96ea-417f-42c5-a1c3-0621726b08a1" containerName="registry-server" Jan 05 15:04:33 crc kubenswrapper[4740]: I0105 15:04:33.153370 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-thc5t" Jan 05 15:04:33 crc kubenswrapper[4740]: I0105 15:04:33.165805 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-thc5t"] Jan 05 15:04:33 crc kubenswrapper[4740]: I0105 15:04:33.311642 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxvkj\" (UniqueName: \"kubernetes.io/projected/2c2a9ef2-2c69-4853-99cc-9729c2805537-kube-api-access-qxvkj\") pod \"certified-operators-thc5t\" (UID: \"2c2a9ef2-2c69-4853-99cc-9729c2805537\") " pod="openshift-marketplace/certified-operators-thc5t" Jan 05 15:04:33 crc kubenswrapper[4740]: I0105 15:04:33.312029 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c2a9ef2-2c69-4853-99cc-9729c2805537-catalog-content\") pod \"certified-operators-thc5t\" (UID: \"2c2a9ef2-2c69-4853-99cc-9729c2805537\") " pod="openshift-marketplace/certified-operators-thc5t" Jan 05 15:04:33 crc kubenswrapper[4740]: I0105 15:04:33.312748 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c2a9ef2-2c69-4853-99cc-9729c2805537-utilities\") pod \"certified-operators-thc5t\" (UID: \"2c2a9ef2-2c69-4853-99cc-9729c2805537\") " pod="openshift-marketplace/certified-operators-thc5t" Jan 05 15:04:33 crc kubenswrapper[4740]: I0105 15:04:33.414945 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c2a9ef2-2c69-4853-99cc-9729c2805537-catalog-content\") pod \"certified-operators-thc5t\" (UID: \"2c2a9ef2-2c69-4853-99cc-9729c2805537\") " pod="openshift-marketplace/certified-operators-thc5t" Jan 05 15:04:33 crc kubenswrapper[4740]: I0105 15:04:33.415187 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c2a9ef2-2c69-4853-99cc-9729c2805537-utilities\") pod \"certified-operators-thc5t\" (UID: \"2c2a9ef2-2c69-4853-99cc-9729c2805537\") " pod="openshift-marketplace/certified-operators-thc5t" Jan 05 15:04:33 crc kubenswrapper[4740]: I0105 15:04:33.415329 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxvkj\" (UniqueName: \"kubernetes.io/projected/2c2a9ef2-2c69-4853-99cc-9729c2805537-kube-api-access-qxvkj\") pod \"certified-operators-thc5t\" (UID: \"2c2a9ef2-2c69-4853-99cc-9729c2805537\") " pod="openshift-marketplace/certified-operators-thc5t" Jan 05 15:04:33 crc kubenswrapper[4740]: I0105 15:04:33.415703 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c2a9ef2-2c69-4853-99cc-9729c2805537-catalog-content\") pod \"certified-operators-thc5t\" (UID: \"2c2a9ef2-2c69-4853-99cc-9729c2805537\") " pod="openshift-marketplace/certified-operators-thc5t" Jan 05 15:04:33 crc kubenswrapper[4740]: I0105 15:04:33.415745 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c2a9ef2-2c69-4853-99cc-9729c2805537-utilities\") pod 
\"certified-operators-thc5t\" (UID: \"2c2a9ef2-2c69-4853-99cc-9729c2805537\") " pod="openshift-marketplace/certified-operators-thc5t" Jan 05 15:04:33 crc kubenswrapper[4740]: I0105 15:04:33.446376 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxvkj\" (UniqueName: \"kubernetes.io/projected/2c2a9ef2-2c69-4853-99cc-9729c2805537-kube-api-access-qxvkj\") pod \"certified-operators-thc5t\" (UID: \"2c2a9ef2-2c69-4853-99cc-9729c2805537\") " pod="openshift-marketplace/certified-operators-thc5t" Jan 05 15:04:33 crc kubenswrapper[4740]: I0105 15:04:33.491867 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-thc5t" Jan 05 15:04:34 crc kubenswrapper[4740]: I0105 15:04:34.098124 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-thc5t"] Jan 05 15:04:34 crc kubenswrapper[4740]: I0105 15:04:34.761270 4740 generic.go:334] "Generic (PLEG): container finished" podID="2c2a9ef2-2c69-4853-99cc-9729c2805537" containerID="d2edbb7a358e508529428156c7a4fa3ad20e859b913493f1622cd2532d6a7f63" exitCode=0 Jan 05 15:04:34 crc kubenswrapper[4740]: I0105 15:04:34.761365 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thc5t" event={"ID":"2c2a9ef2-2c69-4853-99cc-9729c2805537","Type":"ContainerDied","Data":"d2edbb7a358e508529428156c7a4fa3ad20e859b913493f1622cd2532d6a7f63"} Jan 05 15:04:34 crc kubenswrapper[4740]: I0105 15:04:34.761592 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thc5t" event={"ID":"2c2a9ef2-2c69-4853-99cc-9729c2805537","Type":"ContainerStarted","Data":"50bb1b21c0f626daceea34b0543d8ce46f85746bda2cb6b5f623ae79f07d004f"} Jan 05 15:04:34 crc kubenswrapper[4740]: I0105 15:04:34.764127 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 05 15:04:36 crc kubenswrapper[4740]: I0105 15:04:36.793173 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thc5t" event={"ID":"2c2a9ef2-2c69-4853-99cc-9729c2805537","Type":"ContainerStarted","Data":"4a75af9caffa866baf238f39965e0635f342434121a6eb8dadd3a26bddfa4478"} Jan 05 15:04:37 crc kubenswrapper[4740]: I0105 15:04:37.810692 4740 generic.go:334] "Generic (PLEG): container finished" podID="2c2a9ef2-2c69-4853-99cc-9729c2805537" containerID="4a75af9caffa866baf238f39965e0635f342434121a6eb8dadd3a26bddfa4478" exitCode=0 Jan 05 15:04:37 crc kubenswrapper[4740]: I0105 15:04:37.810944 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thc5t" event={"ID":"2c2a9ef2-2c69-4853-99cc-9729c2805537","Type":"ContainerDied","Data":"4a75af9caffa866baf238f39965e0635f342434121a6eb8dadd3a26bddfa4478"} Jan 05 15:04:39 crc kubenswrapper[4740]: I0105 15:04:39.840348 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thc5t" event={"ID":"2c2a9ef2-2c69-4853-99cc-9729c2805537","Type":"ContainerStarted","Data":"467d49cfb9a63a8a77864623a8bf6a64935c137200fe5dae3997c4c24d40ad0a"} Jan 05 15:04:39 crc kubenswrapper[4740]: I0105 15:04:39.877985 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-thc5t" podStartSLOduration=3.037755681 podStartE2EDuration="6.877961559s" podCreationTimestamp="2026-01-05 15:04:33 +0000 UTC" firstStartedPulling="2026-01-05 15:04:34.763904674 
+0000 UTC m=+4524.070813253" lastFinishedPulling="2026-01-05 15:04:38.604110542 +0000 UTC m=+4527.911019131" observedRunningTime="2026-01-05 15:04:39.861453358 +0000 UTC m=+4529.168361947" watchObservedRunningTime="2026-01-05 15:04:39.877961559 +0000 UTC m=+4529.184870148" Jan 05 15:04:43 crc kubenswrapper[4740]: I0105 15:04:43.492325 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-thc5t" Jan 05 15:04:43 crc kubenswrapper[4740]: I0105 15:04:43.493137 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-thc5t" Jan 05 15:04:43 crc kubenswrapper[4740]: I0105 15:04:43.548700 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-thc5t" Jan 05 15:04:43 crc kubenswrapper[4740]: I0105 15:04:43.965513 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-thc5t" Jan 05 15:04:44 crc kubenswrapper[4740]: I0105 15:04:44.020404 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-thc5t"] Jan 05 15:04:45 crc kubenswrapper[4740]: I0105 15:04:45.915048 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-thc5t" podUID="2c2a9ef2-2c69-4853-99cc-9729c2805537" containerName="registry-server" containerID="cri-o://467d49cfb9a63a8a77864623a8bf6a64935c137200fe5dae3997c4c24d40ad0a" gracePeriod=2 Jan 05 15:04:46 crc kubenswrapper[4740]: I0105 15:04:46.483845 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-thc5t" Jan 05 15:04:46 crc kubenswrapper[4740]: I0105 15:04:46.551016 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxvkj\" (UniqueName: \"kubernetes.io/projected/2c2a9ef2-2c69-4853-99cc-9729c2805537-kube-api-access-qxvkj\") pod \"2c2a9ef2-2c69-4853-99cc-9729c2805537\" (UID: \"2c2a9ef2-2c69-4853-99cc-9729c2805537\") " Jan 05 15:04:46 crc kubenswrapper[4740]: I0105 15:04:46.551197 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c2a9ef2-2c69-4853-99cc-9729c2805537-utilities\") pod \"2c2a9ef2-2c69-4853-99cc-9729c2805537\" (UID: \"2c2a9ef2-2c69-4853-99cc-9729c2805537\") " Jan 05 15:04:46 crc kubenswrapper[4740]: I0105 15:04:46.551313 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c2a9ef2-2c69-4853-99cc-9729c2805537-catalog-content\") pod \"2c2a9ef2-2c69-4853-99cc-9729c2805537\" (UID: \"2c2a9ef2-2c69-4853-99cc-9729c2805537\") " Jan 05 15:04:46 crc kubenswrapper[4740]: I0105 15:04:46.553572 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c2a9ef2-2c69-4853-99cc-9729c2805537-utilities" (OuterVolumeSpecName: "utilities") pod "2c2a9ef2-2c69-4853-99cc-9729c2805537" (UID: "2c2a9ef2-2c69-4853-99cc-9729c2805537"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:04:46 crc kubenswrapper[4740]: I0105 15:04:46.561270 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2a9ef2-2c69-4853-99cc-9729c2805537-kube-api-access-qxvkj" (OuterVolumeSpecName: "kube-api-access-qxvkj") pod "2c2a9ef2-2c69-4853-99cc-9729c2805537" (UID: "2c2a9ef2-2c69-4853-99cc-9729c2805537"). InnerVolumeSpecName "kube-api-access-qxvkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 15:04:46 crc kubenswrapper[4740]: I0105 15:04:46.609434 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c2a9ef2-2c69-4853-99cc-9729c2805537-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c2a9ef2-2c69-4853-99cc-9729c2805537" (UID: "2c2a9ef2-2c69-4853-99cc-9729c2805537"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:04:46 crc kubenswrapper[4740]: I0105 15:04:46.654691 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxvkj\" (UniqueName: \"kubernetes.io/projected/2c2a9ef2-2c69-4853-99cc-9729c2805537-kube-api-access-qxvkj\") on node \"crc\" DevicePath \"\"" Jan 05 15:04:46 crc kubenswrapper[4740]: I0105 15:04:46.654738 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c2a9ef2-2c69-4853-99cc-9729c2805537-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 15:04:46 crc kubenswrapper[4740]: I0105 15:04:46.654753 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c2a9ef2-2c69-4853-99cc-9729c2805537-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 15:04:46 crc kubenswrapper[4740]: I0105 15:04:46.936291 4740 generic.go:334] "Generic (PLEG): container finished" podID="2c2a9ef2-2c69-4853-99cc-9729c2805537" containerID="467d49cfb9a63a8a77864623a8bf6a64935c137200fe5dae3997c4c24d40ad0a" exitCode=0 Jan 05 15:04:46 crc kubenswrapper[4740]: I0105 15:04:46.936335 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-thc5t" Jan 05 15:04:46 crc kubenswrapper[4740]: I0105 15:04:46.936393 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thc5t" event={"ID":"2c2a9ef2-2c69-4853-99cc-9729c2805537","Type":"ContainerDied","Data":"467d49cfb9a63a8a77864623a8bf6a64935c137200fe5dae3997c4c24d40ad0a"} Jan 05 15:04:46 crc kubenswrapper[4740]: I0105 15:04:46.936752 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thc5t" event={"ID":"2c2a9ef2-2c69-4853-99cc-9729c2805537","Type":"ContainerDied","Data":"50bb1b21c0f626daceea34b0543d8ce46f85746bda2cb6b5f623ae79f07d004f"} Jan 05 15:04:46 crc kubenswrapper[4740]: I0105 15:04:46.936787 4740 scope.go:117] "RemoveContainer" containerID="467d49cfb9a63a8a77864623a8bf6a64935c137200fe5dae3997c4c24d40ad0a" Jan 05 15:04:46 crc kubenswrapper[4740]: I0105 15:04:46.998577 4740 scope.go:117] "RemoveContainer" containerID="4a75af9caffa866baf238f39965e0635f342434121a6eb8dadd3a26bddfa4478" Jan 05 15:04:47 crc kubenswrapper[4740]: I0105 15:04:47.011854 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-thc5t"] Jan 05 15:04:47 crc kubenswrapper[4740]: I0105 15:04:47.019853 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-thc5t"] Jan 05 15:04:47 crc kubenswrapper[4740]: I0105 15:04:47.035440 4740 scope.go:117] "RemoveContainer" containerID="d2edbb7a358e508529428156c7a4fa3ad20e859b913493f1622cd2532d6a7f63" Jan 05 15:04:47 crc kubenswrapper[4740]: I0105 15:04:47.112435 4740 scope.go:117] "RemoveContainer" containerID="467d49cfb9a63a8a77864623a8bf6a64935c137200fe5dae3997c4c24d40ad0a" Jan 05 15:04:47 crc kubenswrapper[4740]: E0105 15:04:47.113217 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"467d49cfb9a63a8a77864623a8bf6a64935c137200fe5dae3997c4c24d40ad0a\": container with ID starting with 467d49cfb9a63a8a77864623a8bf6a64935c137200fe5dae3997c4c24d40ad0a not found: ID does not exist" containerID="467d49cfb9a63a8a77864623a8bf6a64935c137200fe5dae3997c4c24d40ad0a" Jan 05 15:04:47 crc kubenswrapper[4740]: I0105 15:04:47.113291 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"467d49cfb9a63a8a77864623a8bf6a64935c137200fe5dae3997c4c24d40ad0a"} err="failed to get container status \"467d49cfb9a63a8a77864623a8bf6a64935c137200fe5dae3997c4c24d40ad0a\": rpc error: code = NotFound desc = could not find container \"467d49cfb9a63a8a77864623a8bf6a64935c137200fe5dae3997c4c24d40ad0a\": container with ID starting with 467d49cfb9a63a8a77864623a8bf6a64935c137200fe5dae3997c4c24d40ad0a not found: ID does not exist" Jan 05 15:04:47 crc kubenswrapper[4740]: I0105 15:04:47.113333 4740 scope.go:117] "RemoveContainer" containerID="4a75af9caffa866baf238f39965e0635f342434121a6eb8dadd3a26bddfa4478" Jan 05 15:04:47 crc kubenswrapper[4740]: E0105 15:04:47.113849 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a75af9caffa866baf238f39965e0635f342434121a6eb8dadd3a26bddfa4478\": container with ID starting with 4a75af9caffa866baf238f39965e0635f342434121a6eb8dadd3a26bddfa4478 not found: ID does not exist" containerID="4a75af9caffa866baf238f39965e0635f342434121a6eb8dadd3a26bddfa4478" Jan 05 15:04:47 crc kubenswrapper[4740]: I0105 15:04:47.113895 4740 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a75af9caffa866baf238f39965e0635f342434121a6eb8dadd3a26bddfa4478"} err="failed to get container status \"4a75af9caffa866baf238f39965e0635f342434121a6eb8dadd3a26bddfa4478\": rpc error: code = NotFound desc = could not find container \"4a75af9caffa866baf238f39965e0635f342434121a6eb8dadd3a26bddfa4478\": container with ID starting with 4a75af9caffa866baf238f39965e0635f342434121a6eb8dadd3a26bddfa4478 not found: ID does not exist" Jan 05 15:04:47 crc kubenswrapper[4740]: I0105 15:04:47.113927 4740 scope.go:117] "RemoveContainer" containerID="d2edbb7a358e508529428156c7a4fa3ad20e859b913493f1622cd2532d6a7f63" Jan 05 15:04:47 crc kubenswrapper[4740]: E0105 15:04:47.114440 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2edbb7a358e508529428156c7a4fa3ad20e859b913493f1622cd2532d6a7f63\": container with ID starting with d2edbb7a358e508529428156c7a4fa3ad20e859b913493f1622cd2532d6a7f63 not found: ID does not exist" containerID="d2edbb7a358e508529428156c7a4fa3ad20e859b913493f1622cd2532d6a7f63" Jan 05 15:04:47 crc kubenswrapper[4740]: I0105 15:04:47.114503 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2edbb7a358e508529428156c7a4fa3ad20e859b913493f1622cd2532d6a7f63"} err="failed to get container status \"d2edbb7a358e508529428156c7a4fa3ad20e859b913493f1622cd2532d6a7f63\": rpc error: code = NotFound desc = could not find container \"d2edbb7a358e508529428156c7a4fa3ad20e859b913493f1622cd2532d6a7f63\": container with ID starting with d2edbb7a358e508529428156c7a4fa3ad20e859b913493f1622cd2532d6a7f63 not found: ID does not exist" Jan 05 15:04:48 crc kubenswrapper[4740]: I0105 15:04:48.997431 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c2a9ef2-2c69-4853-99cc-9729c2805537" path="/var/lib/kubelet/pods/2c2a9ef2-2c69-4853-99cc-9729c2805537/volumes" Jan 05 15:05:01 crc kubenswrapper[4740]: I0105 15:05:01.916171 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 15:05:01 crc kubenswrapper[4740]: I0105 15:05:01.916579 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 15:05:31 crc kubenswrapper[4740]: I0105 15:05:31.915948 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 15:05:31 crc kubenswrapper[4740]: I0105 15:05:31.916670 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 15:05:31 crc kubenswrapper[4740]: I0105 
15:05:31.916737 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 15:05:31 crc kubenswrapper[4740]: I0105 15:05:31.917797 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 15:05:31 crc kubenswrapper[4740]: I0105 15:05:31.917869 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" gracePeriod=600 Jan 05 15:05:32 crc kubenswrapper[4740]: E0105 15:05:32.546352 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:05:33 crc kubenswrapper[4740]: I0105 15:05:33.535914 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" exitCode=0 Jan 05 15:05:33 crc kubenswrapper[4740]: I0105 15:05:33.536018 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be"} Jan 05 15:05:33 crc kubenswrapper[4740]: I0105 15:05:33.537299 4740 scope.go:117] "RemoveContainer" containerID="4670c010037ec4b3a80d8c75d95e15ce9bfc5fefe1b21f8c394a8746716d6d93" Jan 05 15:05:33 crc kubenswrapper[4740]: I0105 15:05:33.538207 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:05:33 crc kubenswrapper[4740]: E0105 15:05:33.538721 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:05:44 crc kubenswrapper[4740]: I0105 15:05:44.969289 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:05:44 crc kubenswrapper[4740]: E0105 15:05:44.970455 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" 
podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:05:58 crc kubenswrapper[4740]: I0105 15:05:58.968722 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:05:58 crc kubenswrapper[4740]: E0105 15:05:58.969601 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:06:12 crc kubenswrapper[4740]: I0105 15:06:12.969049 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:06:12 crc kubenswrapper[4740]: E0105 15:06:12.970116 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:06:27 crc kubenswrapper[4740]: I0105 15:06:27.967844 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:06:27 crc kubenswrapper[4740]: E0105 15:06:27.968581 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:06:42 crc kubenswrapper[4740]: I0105 15:06:42.968977 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:06:42 crc kubenswrapper[4740]: E0105 15:06:42.971132 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.012316 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Jan 05 15:06:47 crc kubenswrapper[4740]: E0105 15:06:47.013725 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2a9ef2-2c69-4853-99cc-9729c2805537" containerName="extract-content" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.013747 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2a9ef2-2c69-4853-99cc-9729c2805537" containerName="extract-content" Jan 05 15:06:47 crc kubenswrapper[4740]: E0105 15:06:47.013799 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2a9ef2-2c69-4853-99cc-9729c2805537" containerName="registry-server" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.013807 
4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2a9ef2-2c69-4853-99cc-9729c2805537" containerName="registry-server" Jan 05 15:06:47 crc kubenswrapper[4740]: E0105 15:06:47.013847 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2a9ef2-2c69-4853-99cc-9729c2805537" containerName="extract-utilities" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.013856 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2a9ef2-2c69-4853-99cc-9729c2805537" containerName="extract-utilities" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.014187 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2a9ef2-2c69-4853-99cc-9729c2805537" containerName="registry-server" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.015303 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.023928 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.024515 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jv2n5" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.024821 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.024847 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.035127 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.184030 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/efc1648a-7270-45c6-af93-bd4b641931d2-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.184760 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.185041 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.185270 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt95g\" (UniqueName: \"kubernetes.io/projected/efc1648a-7270-45c6-af93-bd4b641931d2-kube-api-access-kt95g\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.185482 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/efc1648a-7270-45c6-af93-bd4b641931d2-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.185578 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.185741 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.185878 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/efc1648a-7270-45c6-af93-bd4b641931d2-config-data\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.186153 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/efc1648a-7270-45c6-af93-bd4b641931d2-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.288412 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/efc1648a-7270-45c6-af93-bd4b641931d2-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.288807 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/efc1648a-7270-45c6-af93-bd4b641931d2-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.288955 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.289099 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.289206 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt95g\" (UniqueName: 
\"kubernetes.io/projected/efc1648a-7270-45c6-af93-bd4b641931d2-kube-api-access-kt95g\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.289265 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/efc1648a-7270-45c6-af93-bd4b641931d2-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.289419 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/efc1648a-7270-45c6-af93-bd4b641931d2-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.289521 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.289640 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.289751 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/efc1648a-7270-45c6-af93-bd4b641931d2-config-data\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.289848 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/efc1648a-7270-45c6-af93-bd4b641931d2-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.289668 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/efc1648a-7270-45c6-af93-bd4b641931d2-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.290696 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/efc1648a-7270-45c6-af93-bd4b641931d2-config-data\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.290956 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") device 
mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.633314 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.633345 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.633689 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.649404 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt95g\" (UniqueName: \"kubernetes.io/projected/efc1648a-7270-45c6-af93-bd4b641931d2-kube-api-access-kt95g\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.674520 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " pod="openstack/tempest-tests-tempest" Jan 05 15:06:47 crc kubenswrapper[4740]: I0105 15:06:47.953940 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 05 15:06:48 crc kubenswrapper[4740]: I0105 15:06:48.438272 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 05 15:06:48 crc kubenswrapper[4740]: I0105 15:06:48.484580 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"efc1648a-7270-45c6-af93-bd4b641931d2","Type":"ContainerStarted","Data":"1844a5a25e10f8ff3d5935c66e67e7944b8553c3067d19869395ab97b855e364"} Jan 05 15:06:55 crc kubenswrapper[4740]: I0105 15:06:55.971212 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:06:55 crc kubenswrapper[4740]: E0105 15:06:55.974647 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:07:07 crc kubenswrapper[4740]: I0105 15:07:07.969505 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:07:07 crc kubenswrapper[4740]: E0105 15:07:07.970210 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:07:19 crc kubenswrapper[4740]: I0105 15:07:19.968736 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:07:19 crc kubenswrapper[4740]: E0105 15:07:19.969915 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:07:32 crc kubenswrapper[4740]: I0105 15:07:32.968024 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:07:32 crc kubenswrapper[4740]: E0105 15:07:32.968708 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:07:36 crc kubenswrapper[4740]: E0105 15:07:36.547834 4740 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Jan 05 15:07:36 crc kubenswrapper[4740]: E0105 15:07:36.550241 4740 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kt95g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(efc1648a-7270-45c6-af93-bd4b641931d2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 05 15:07:36 crc kubenswrapper[4740]: E0105 15:07:36.551729 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/tempest-tests-tempest" podUID="efc1648a-7270-45c6-af93-bd4b641931d2" Jan 05 15:07:37 crc kubenswrapper[4740]: E0105 15:07:37.524316 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="efc1648a-7270-45c6-af93-bd4b641931d2" Jan 05 15:07:47 crc kubenswrapper[4740]: I0105 15:07:47.970146 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:07:47 crc kubenswrapper[4740]: E0105 15:07:47.971268 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:07:50 crc kubenswrapper[4740]: I0105 15:07:50.741630 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 05 15:07:52 crc kubenswrapper[4740]: I0105 15:07:52.716954 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"efc1648a-7270-45c6-af93-bd4b641931d2","Type":"ContainerStarted","Data":"43d04121a6ef67b06a60a9130b318784924d123d066d8260a4de16e86f97afea"} Jan 05 15:07:52 crc kubenswrapper[4740]: I0105 15:07:52.752915 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=5.453140091 podStartE2EDuration="1m7.752892717s" podCreationTimestamp="2026-01-05 15:06:45 +0000 UTC" firstStartedPulling="2026-01-05 15:06:48.438918157 +0000 UTC m=+4657.745826766" lastFinishedPulling="2026-01-05 15:07:50.738670813 +0000 UTC m=+4720.045579392" observedRunningTime="2026-01-05 15:07:52.741217541 +0000 UTC m=+4722.048126170" watchObservedRunningTime="2026-01-05 15:07:52.752892717 +0000 UTC m=+4722.059801316" Jan 05 15:08:02 crc kubenswrapper[4740]: I0105 15:08:02.968400 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:08:02 crc kubenswrapper[4740]: E0105 15:08:02.969146 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:08:14 crc kubenswrapper[4740]: I0105 15:08:14.968793 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:08:14 crc kubenswrapper[4740]: E0105 15:08:14.969557 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:08:22 crc kubenswrapper[4740]: I0105 15:08:22.338638 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tpdbp"] Jan 05 15:08:22 crc kubenswrapper[4740]: I0105 15:08:22.343087 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tpdbp" Jan 05 15:08:22 crc kubenswrapper[4740]: I0105 15:08:22.353328 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tpdbp"] Jan 05 15:08:22 crc kubenswrapper[4740]: I0105 15:08:22.419772 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzp9m\" (UniqueName: \"kubernetes.io/projected/1767a202-36d4-4ff0-928b-6871274b165a-kube-api-access-kzp9m\") pod \"community-operators-tpdbp\" (UID: \"1767a202-36d4-4ff0-928b-6871274b165a\") " pod="openshift-marketplace/community-operators-tpdbp" Jan 05 15:08:22 crc kubenswrapper[4740]: I0105 15:08:22.420263 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1767a202-36d4-4ff0-928b-6871274b165a-utilities\") pod \"community-operators-tpdbp\" (UID: \"1767a202-36d4-4ff0-928b-6871274b165a\") " pod="openshift-marketplace/community-operators-tpdbp" Jan 05 15:08:22 crc kubenswrapper[4740]: I0105 15:08:22.420491 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1767a202-36d4-4ff0-928b-6871274b165a-catalog-content\") pod \"community-operators-tpdbp\" (UID: \"1767a202-36d4-4ff0-928b-6871274b165a\") " pod="openshift-marketplace/community-operators-tpdbp" Jan 05 15:08:22 crc kubenswrapper[4740]: I0105 15:08:22.523244 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1767a202-36d4-4ff0-928b-6871274b165a-utilities\") pod \"community-operators-tpdbp\" (UID: \"1767a202-36d4-4ff0-928b-6871274b165a\") " pod="openshift-marketplace/community-operators-tpdbp" Jan 05 15:08:22 crc kubenswrapper[4740]: I0105 15:08:22.523330 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1767a202-36d4-4ff0-928b-6871274b165a-catalog-content\") pod \"community-operators-tpdbp\" (UID: \"1767a202-36d4-4ff0-928b-6871274b165a\") " pod="openshift-marketplace/community-operators-tpdbp" Jan 05 15:08:22 crc kubenswrapper[4740]: I0105 15:08:22.523433 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzp9m\" (UniqueName: \"kubernetes.io/projected/1767a202-36d4-4ff0-928b-6871274b165a-kube-api-access-kzp9m\") pod \"community-operators-tpdbp\" (UID: \"1767a202-36d4-4ff0-928b-6871274b165a\") " pod="openshift-marketplace/community-operators-tpdbp" Jan 05 15:08:22 crc kubenswrapper[4740]: I0105 15:08:22.525687 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1767a202-36d4-4ff0-928b-6871274b165a-utilities\") pod \"community-operators-tpdbp\" (UID: \"1767a202-36d4-4ff0-928b-6871274b165a\") " pod="openshift-marketplace/community-operators-tpdbp" Jan 05 15:08:22 crc kubenswrapper[4740]: I0105 15:08:22.525878 4740 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1767a202-36d4-4ff0-928b-6871274b165a-catalog-content\") pod \"community-operators-tpdbp\" (UID: \"1767a202-36d4-4ff0-928b-6871274b165a\") " pod="openshift-marketplace/community-operators-tpdbp" Jan 05 15:08:22 crc kubenswrapper[4740]: I0105 15:08:22.553994 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzp9m\" (UniqueName: \"kubernetes.io/projected/1767a202-36d4-4ff0-928b-6871274b165a-kube-api-access-kzp9m\") pod \"community-operators-tpdbp\" (UID: \"1767a202-36d4-4ff0-928b-6871274b165a\") " pod="openshift-marketplace/community-operators-tpdbp" Jan 05 15:08:22 crc kubenswrapper[4740]: I0105 15:08:22.669713 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tpdbp" Jan 05 15:08:23 crc kubenswrapper[4740]: I0105 15:08:23.471086 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tpdbp"] Jan 05 15:08:23 crc kubenswrapper[4740]: W0105 15:08:23.509033 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1767a202_36d4_4ff0_928b_6871274b165a.slice/crio-53e6f8d0917a3fa0a1467bc047446f754c98efbc29e4ce5eb88e5cbda30a007c WatchSource:0}: Error finding container 53e6f8d0917a3fa0a1467bc047446f754c98efbc29e4ce5eb88e5cbda30a007c: Status 404 returned error can't find the container with id 53e6f8d0917a3fa0a1467bc047446f754c98efbc29e4ce5eb88e5cbda30a007c Jan 05 15:08:24 crc kubenswrapper[4740]: I0105 15:08:24.168815 4740 generic.go:334] "Generic (PLEG): container finished" podID="1767a202-36d4-4ff0-928b-6871274b165a" containerID="a87910b2c5554d61f19443d65f64f399974e4c34287c526aeb03ff330a5af121" exitCode=0 Jan 05 15:08:24 crc kubenswrapper[4740]: I0105 15:08:24.169165 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpdbp" event={"ID":"1767a202-36d4-4ff0-928b-6871274b165a","Type":"ContainerDied","Data":"a87910b2c5554d61f19443d65f64f399974e4c34287c526aeb03ff330a5af121"} Jan 05 15:08:24 crc kubenswrapper[4740]: I0105 15:08:24.169204 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpdbp" event={"ID":"1767a202-36d4-4ff0-928b-6871274b165a","Type":"ContainerStarted","Data":"53e6f8d0917a3fa0a1467bc047446f754c98efbc29e4ce5eb88e5cbda30a007c"} Jan 05 15:08:26 crc kubenswrapper[4740]: I0105 15:08:26.189549 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpdbp" event={"ID":"1767a202-36d4-4ff0-928b-6871274b165a","Type":"ContainerStarted","Data":"d846495d02e55b69754d5b699b942c3798bd6477994f83a1513dc2a7bb0c8edb"} Jan 05 15:08:26 crc kubenswrapper[4740]: I0105 15:08:26.968307 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:08:26 crc kubenswrapper[4740]: E0105 15:08:26.968853 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:08:27 crc 
kubenswrapper[4740]: I0105 15:08:27.202039 4740 generic.go:334] "Generic (PLEG): container finished" podID="1767a202-36d4-4ff0-928b-6871274b165a" containerID="d846495d02e55b69754d5b699b942c3798bd6477994f83a1513dc2a7bb0c8edb" exitCode=0 Jan 05 15:08:27 crc kubenswrapper[4740]: I0105 15:08:27.202099 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpdbp" event={"ID":"1767a202-36d4-4ff0-928b-6871274b165a","Type":"ContainerDied","Data":"d846495d02e55b69754d5b699b942c3798bd6477994f83a1513dc2a7bb0c8edb"} Jan 05 15:08:28 crc kubenswrapper[4740]: I0105 15:08:28.213675 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpdbp" event={"ID":"1767a202-36d4-4ff0-928b-6871274b165a","Type":"ContainerStarted","Data":"e25d44c723355c4456a7d50237e7417a8d436abf15837cc45a04c39705d17aba"} Jan 05 15:08:28 crc kubenswrapper[4740]: I0105 15:08:28.231877 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tpdbp" podStartSLOduration=2.646507532 podStartE2EDuration="6.231862782s" podCreationTimestamp="2026-01-05 15:08:22 +0000 UTC" firstStartedPulling="2026-01-05 15:08:24.172271945 +0000 UTC m=+4753.479180534" lastFinishedPulling="2026-01-05 15:08:27.757627205 +0000 UTC m=+4757.064535784" observedRunningTime="2026-01-05 15:08:28.230096625 +0000 UTC m=+4757.537005224" watchObservedRunningTime="2026-01-05 15:08:28.231862782 +0000 UTC m=+4757.538771361" Jan 05 15:08:32 crc kubenswrapper[4740]: I0105 15:08:32.670777 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tpdbp" Jan 05 15:08:32 crc kubenswrapper[4740]: I0105 15:08:32.671412 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tpdbp" Jan 05 15:08:33 crc kubenswrapper[4740]: I0105 15:08:33.743535 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-tpdbp" podUID="1767a202-36d4-4ff0-928b-6871274b165a" containerName="registry-server" probeResult="failure" output=< Jan 05 15:08:33 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:08:33 crc kubenswrapper[4740]: > Jan 05 15:08:37 crc kubenswrapper[4740]: I0105 15:08:37.969843 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:08:37 crc kubenswrapper[4740]: E0105 15:08:37.970910 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:08:42 crc kubenswrapper[4740]: I0105 15:08:42.757928 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tpdbp" Jan 05 15:08:42 crc kubenswrapper[4740]: I0105 15:08:42.824410 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tpdbp" Jan 05 15:08:43 crc kubenswrapper[4740]: I0105 15:08:43.011967 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tpdbp"] Jan 05 
15:08:44 crc kubenswrapper[4740]: I0105 15:08:44.419959 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tpdbp" podUID="1767a202-36d4-4ff0-928b-6871274b165a" containerName="registry-server" containerID="cri-o://e25d44c723355c4456a7d50237e7417a8d436abf15837cc45a04c39705d17aba" gracePeriod=2 Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.234312 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tpdbp" Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.361567 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1767a202-36d4-4ff0-928b-6871274b165a-utilities\") pod \"1767a202-36d4-4ff0-928b-6871274b165a\" (UID: \"1767a202-36d4-4ff0-928b-6871274b165a\") " Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.361671 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzp9m\" (UniqueName: \"kubernetes.io/projected/1767a202-36d4-4ff0-928b-6871274b165a-kube-api-access-kzp9m\") pod \"1767a202-36d4-4ff0-928b-6871274b165a\" (UID: \"1767a202-36d4-4ff0-928b-6871274b165a\") " Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.362144 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1767a202-36d4-4ff0-928b-6871274b165a-catalog-content\") pod \"1767a202-36d4-4ff0-928b-6871274b165a\" (UID: \"1767a202-36d4-4ff0-928b-6871274b165a\") " Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.363443 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1767a202-36d4-4ff0-928b-6871274b165a-utilities" (OuterVolumeSpecName: "utilities") pod "1767a202-36d4-4ff0-928b-6871274b165a" (UID: "1767a202-36d4-4ff0-928b-6871274b165a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.386889 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1767a202-36d4-4ff0-928b-6871274b165a-kube-api-access-kzp9m" (OuterVolumeSpecName: "kube-api-access-kzp9m") pod "1767a202-36d4-4ff0-928b-6871274b165a" (UID: "1767a202-36d4-4ff0-928b-6871274b165a"). InnerVolumeSpecName "kube-api-access-kzp9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.436358 4740 generic.go:334] "Generic (PLEG): container finished" podID="1767a202-36d4-4ff0-928b-6871274b165a" containerID="e25d44c723355c4456a7d50237e7417a8d436abf15837cc45a04c39705d17aba" exitCode=0 Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.436405 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpdbp" event={"ID":"1767a202-36d4-4ff0-928b-6871274b165a","Type":"ContainerDied","Data":"e25d44c723355c4456a7d50237e7417a8d436abf15837cc45a04c39705d17aba"} Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.437375 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpdbp" event={"ID":"1767a202-36d4-4ff0-928b-6871274b165a","Type":"ContainerDied","Data":"53e6f8d0917a3fa0a1467bc047446f754c98efbc29e4ce5eb88e5cbda30a007c"} Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.437407 4740 scope.go:117] "RemoveContainer" containerID="e25d44c723355c4456a7d50237e7417a8d436abf15837cc45a04c39705d17aba" Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.437640 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tpdbp" Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.449152 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1767a202-36d4-4ff0-928b-6871274b165a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1767a202-36d4-4ff0-928b-6871274b165a" (UID: "1767a202-36d4-4ff0-928b-6871274b165a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.464757 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1767a202-36d4-4ff0-928b-6871274b165a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.464801 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1767a202-36d4-4ff0-928b-6871274b165a-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.464818 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzp9m\" (UniqueName: \"kubernetes.io/projected/1767a202-36d4-4ff0-928b-6871274b165a-kube-api-access-kzp9m\") on node \"crc\" DevicePath \"\"" Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.484616 4740 scope.go:117] "RemoveContainer" containerID="d846495d02e55b69754d5b699b942c3798bd6477994f83a1513dc2a7bb0c8edb" Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.513247 4740 scope.go:117] "RemoveContainer" containerID="a87910b2c5554d61f19443d65f64f399974e4c34287c526aeb03ff330a5af121" Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.581866 4740 scope.go:117] "RemoveContainer" containerID="e25d44c723355c4456a7d50237e7417a8d436abf15837cc45a04c39705d17aba" Jan 05 15:08:45 crc kubenswrapper[4740]: E0105 15:08:45.585151 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e25d44c723355c4456a7d50237e7417a8d436abf15837cc45a04c39705d17aba\": container with ID starting with e25d44c723355c4456a7d50237e7417a8d436abf15837cc45a04c39705d17aba not found: ID does not exist" 
containerID="e25d44c723355c4456a7d50237e7417a8d436abf15837cc45a04c39705d17aba" Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.585480 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25d44c723355c4456a7d50237e7417a8d436abf15837cc45a04c39705d17aba"} err="failed to get container status \"e25d44c723355c4456a7d50237e7417a8d436abf15837cc45a04c39705d17aba\": rpc error: code = NotFound desc = could not find container \"e25d44c723355c4456a7d50237e7417a8d436abf15837cc45a04c39705d17aba\": container with ID starting with e25d44c723355c4456a7d50237e7417a8d436abf15837cc45a04c39705d17aba not found: ID does not exist" Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.585508 4740 scope.go:117] "RemoveContainer" containerID="d846495d02e55b69754d5b699b942c3798bd6477994f83a1513dc2a7bb0c8edb" Jan 05 15:08:45 crc kubenswrapper[4740]: E0105 15:08:45.585952 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d846495d02e55b69754d5b699b942c3798bd6477994f83a1513dc2a7bb0c8edb\": container with ID starting with d846495d02e55b69754d5b699b942c3798bd6477994f83a1513dc2a7bb0c8edb not found: ID does not exist" containerID="d846495d02e55b69754d5b699b942c3798bd6477994f83a1513dc2a7bb0c8edb" Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.585980 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d846495d02e55b69754d5b699b942c3798bd6477994f83a1513dc2a7bb0c8edb"} err="failed to get container status \"d846495d02e55b69754d5b699b942c3798bd6477994f83a1513dc2a7bb0c8edb\": rpc error: code = NotFound desc = could not find container \"d846495d02e55b69754d5b699b942c3798bd6477994f83a1513dc2a7bb0c8edb\": container with ID starting with d846495d02e55b69754d5b699b942c3798bd6477994f83a1513dc2a7bb0c8edb not found: ID does not exist" Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.586001 4740 scope.go:117] "RemoveContainer" containerID="a87910b2c5554d61f19443d65f64f399974e4c34287c526aeb03ff330a5af121" Jan 05 15:08:45 crc kubenswrapper[4740]: E0105 15:08:45.586517 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a87910b2c5554d61f19443d65f64f399974e4c34287c526aeb03ff330a5af121\": container with ID starting with a87910b2c5554d61f19443d65f64f399974e4c34287c526aeb03ff330a5af121 not found: ID does not exist" containerID="a87910b2c5554d61f19443d65f64f399974e4c34287c526aeb03ff330a5af121" Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.586546 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a87910b2c5554d61f19443d65f64f399974e4c34287c526aeb03ff330a5af121"} err="failed to get container status \"a87910b2c5554d61f19443d65f64f399974e4c34287c526aeb03ff330a5af121\": rpc error: code = NotFound desc = could not find container \"a87910b2c5554d61f19443d65f64f399974e4c34287c526aeb03ff330a5af121\": container with ID starting with a87910b2c5554d61f19443d65f64f399974e4c34287c526aeb03ff330a5af121 not found: ID does not exist" Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.774499 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tpdbp"] Jan 05 15:08:45 crc kubenswrapper[4740]: I0105 15:08:45.788945 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tpdbp"] Jan 05 15:08:46 crc kubenswrapper[4740]: I0105 15:08:46.984527 
4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1767a202-36d4-4ff0-928b-6871274b165a" path="/var/lib/kubelet/pods/1767a202-36d4-4ff0-928b-6871274b165a/volumes" Jan 05 15:08:48 crc kubenswrapper[4740]: I0105 15:08:48.968910 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:08:48 crc kubenswrapper[4740]: E0105 15:08:48.969949 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:09:00 crc kubenswrapper[4740]: I0105 15:09:00.739522 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="261c319a-37da-4987-a774-ecc24fa6b083" containerName="galera" probeResult="failure" output="command timed out" Jan 05 15:09:00 crc kubenswrapper[4740]: I0105 15:09:00.739513 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="261c319a-37da-4987-a774-ecc24fa6b083" containerName="galera" probeResult="failure" output="command timed out" Jan 05 15:09:03 crc kubenswrapper[4740]: I0105 15:09:03.979160 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:09:03 crc kubenswrapper[4740]: E0105 15:09:03.985388 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:09:15 crc kubenswrapper[4740]: I0105 15:09:15.969571 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:09:15 crc kubenswrapper[4740]: E0105 15:09:15.971351 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:09:29 crc kubenswrapper[4740]: I0105 15:09:29.969992 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:09:29 crc kubenswrapper[4740]: E0105 15:09:29.971274 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:09:44 crc kubenswrapper[4740]: I0105 15:09:44.972719 4740 scope.go:117] "RemoveContainer" 
containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:09:44 crc kubenswrapper[4740]: E0105 15:09:44.975492 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:09:57 crc kubenswrapper[4740]: I0105 15:09:57.972795 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:09:57 crc kubenswrapper[4740]: E0105 15:09:57.977474 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:10:08 crc kubenswrapper[4740]: I0105 15:10:08.971347 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:10:08 crc kubenswrapper[4740]: E0105 15:10:08.974388 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:10:12 crc kubenswrapper[4740]: I0105 15:10:12.385121 4740 patch_prober.go:28] interesting pod/thanos-querier-78985bc954-b6gsd container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.86:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:12 crc kubenswrapper[4740]: I0105 15:10:12.386037 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" podUID="3c267f0a-b0bb-43fe-9a21-92472096a632" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.86:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:17 crc kubenswrapper[4740]: I0105 15:10:17.094727 4740 patch_prober.go:28] interesting pod/route-controller-manager-bf4f87789-2p768 container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:17 crc kubenswrapper[4740]: I0105 15:10:17.094723 4740 patch_prober.go:28] interesting pod/route-controller-manager-bf4f87789-2p768 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:17 crc kubenswrapper[4740]: I0105 
15:10:17.098120 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" podUID="f4272a46-6424-418e-baf7-dba25f1813c4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:17 crc kubenswrapper[4740]: I0105 15:10:17.098167 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" podUID="f4272a46-6424-418e-baf7-dba25f1813c4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:17 crc kubenswrapper[4740]: I0105 15:10:17.687235 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-hfcj9" podUID="01f58b56-275e-432c-aecc-f9853194f0fd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:17 crc kubenswrapper[4740]: I0105 15:10:17.687263 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-hfcj9" podUID="01f58b56-275e-432c-aecc-f9853194f0fd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:17 crc kubenswrapper[4740]: I0105 15:10:17.689869 4740 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:17 crc kubenswrapper[4740]: I0105 15:10:17.689925 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:17 crc kubenswrapper[4740]: I0105 15:10:17.822417 4740 patch_prober.go:28] interesting pod/controller-manager-6cdbbbb5bc-rbd47 container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:17 crc kubenswrapper[4740]: I0105 15:10:17.822488 4740 patch_prober.go:28] interesting pod/controller-manager-6cdbbbb5bc-rbd47 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:17 crc kubenswrapper[4740]: I0105 15:10:17.822514 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" podUID="f061b27b-1e1c-42c2-9230-8f82deda6325" 
containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:17 crc kubenswrapper[4740]: I0105 15:10:17.822564 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" podUID="f061b27b-1e1c-42c2-9230-8f82deda6325" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:17 crc kubenswrapper[4740]: I0105 15:10:17.909187 4740 patch_prober.go:28] interesting pod/logging-loki-distributor-5f678c8dd6-bdcsh container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:17 crc kubenswrapper[4740]: I0105 15:10:17.909252 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" podUID="7927742d-54ad-4fbb-841a-71d40648d88e" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.098868 4740 patch_prober.go:28] interesting pod/logging-loki-querier-76788598db-n78fw container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.098941 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-76788598db-n78fw" podUID="5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.136153 4740 patch_prober.go:28] interesting pod/logging-loki-query-frontend-69d9546745-lbmjc container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.136219 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" podUID="d8fd9857-fca2-4041-9c72-3747c84b6987" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.238943 4740 patch_prober.go:28] interesting pod/nmstate-webhook-f8fb84555-zvv7k container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.93:9443/readyz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.239000 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k" podUID="e6df01b0-f4c2-49c2-982d-4b814fd5d493" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.93:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.398233 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-9b6f8f78c-6wdrm" podUID="ab509865-4e08-4927-b702-f28bfb553a27" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.398328 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/placement-operator-controller-manager-9b6f8f78c-6wdrm" podUID="ab509865-4e08-4927-b702-f28bfb553a27" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.594667 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8svgw container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.594801 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" podUID="fd41c01f-dff3-4b6a-ae38-8b114b384a59" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.595862 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8svgw container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.595917 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" podUID="fd41c01f-dff3-4b6a-ae38-8b114b384a59" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.53:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.734552 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="bf526b0a-998c-4943-bfba-04352421ed58" containerName="ovn-northd" probeResult="failure" output="command timed out" Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.735311 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="bf526b0a-998c-4943-bfba-04352421ed58" containerName="ovn-northd" probeResult="failure" output="command timed out" Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.880602 4740 patch_prober.go:28] interesting 
pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded" start-of-body= Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.881039 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded" Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.939852 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8vc2p container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:18 crc kubenswrapper[4740]: I0105 15:10:18.940006 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" podUID="1e34aa71-7c05-4606-a71a-2c5b20667ba1" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:19 crc kubenswrapper[4740]: I0105 15:10:19.231292 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" podUID="939f9254-d7a5-48cb-8ab1-7ea2e4f68610" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.101:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:19 crc kubenswrapper[4740]: I0105 15:10:19.231331 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" podUID="939f9254-d7a5-48cb-8ab1-7ea2e4f68610" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.101:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:19 crc kubenswrapper[4740]: I0105 15:10:19.338772 4740 trace.go:236] Trace[1900725913]: "Calculate volume metrics of wal for pod openshift-logging/logging-loki-ingester-0" (05-Jan-2026 15:10:15.660) (total time: 3677ms): Jan 05 15:10:19 crc kubenswrapper[4740]: Trace[1900725913]: [3.677305801s] [3.677305801s] END Jan 05 15:10:19 crc kubenswrapper[4740]: I0105 15:10:19.594559 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8svgw container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.53:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:19 crc kubenswrapper[4740]: I0105 15:10:19.594624 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" podUID="fd41c01f-dff3-4b6a-ae38-8b114b384a59" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.53:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:19 crc kubenswrapper[4740]: I0105 15:10:19.733408 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="dbb5263b-e98b-48a4-825e-ffb99738059f" containerName="galera" probeResult="failure" output="command timed out" Jan 05 15:10:19 crc kubenswrapper[4740]: I0105 
15:10:19.733596 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="dbb5263b-e98b-48a4-825e-ffb99738059f" containerName="galera" probeResult="failure" output="command timed out" Jan 05 15:10:20 crc kubenswrapper[4740]: I0105 15:10:20.241390 4740 trace.go:236] Trace[1869444818]: "Calculate volume metrics of glance for pod openstack/glance-default-internal-api-0" (05-Jan-2026 15:10:19.219) (total time: 1021ms): Jan 05 15:10:20 crc kubenswrapper[4740]: Trace[1869444818]: [1.021974608s] [1.021974608s] END Jan 05 15:10:20 crc kubenswrapper[4740]: I0105 15:10:20.578241 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" podUID="49bbef73-8653-4747-93ee-35819a394b1f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.129:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:20 crc kubenswrapper[4740]: I0105 15:10:20.578501 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" podUID="afd48a06-614a-4fed-8629-a1a2eb83ab80" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.102:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:20 crc kubenswrapper[4740]: I0105 15:10:20.578956 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-ffh7k" podUID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:20 crc kubenswrapper[4740]: I0105 15:10:20.578951 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" podUID="afd48a06-614a-4fed-8629-a1a2eb83ab80" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.102:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:20 crc kubenswrapper[4740]: I0105 15:10:20.579023 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-ffh7k" podUID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:20 crc kubenswrapper[4740]: I0105 15:10:20.579056 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-ffh7k" podUID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:20 crc kubenswrapper[4740]: I0105 15:10:20.579108 4740 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-n6tdc container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:20 crc kubenswrapper[4740]: I0105 15:10:20.579128 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" 
podUID="1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:20 crc kubenswrapper[4740]: I0105 15:10:20.579156 4740 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-n6tdc container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:20 crc kubenswrapper[4740]: I0105 15:10:20.579167 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" podUID="1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:20 crc kubenswrapper[4740]: I0105 15:10:20.699297 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" podUID="237e57c0-4e0a-4c9f-89a2-e5a84fb41d22" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.44:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:20 crc kubenswrapper[4740]: I0105 15:10:20.699277 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" podUID="237e57c0-4e0a-4c9f-89a2-e5a84fb41d22" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.44:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:20 crc kubenswrapper[4740]: I0105 15:10:20.732474 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="261c319a-37da-4987-a774-ecc24fa6b083" containerName="galera" probeResult="failure" output="command timed out" Jan 05 15:10:20 crc kubenswrapper[4740]: I0105 15:10:20.732874 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="261c319a-37da-4987-a774-ecc24fa6b083" containerName="galera" probeResult="failure" output="command timed out" Jan 05 15:10:20 crc kubenswrapper[4740]: I0105 15:10:20.737925 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="2b922e17-0ca9-49cd-8af7-b78776b990bb" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Jan 05 15:10:21 crc kubenswrapper[4740]: I0105 15:10:21.447397 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-xlq6v" podUID="3a862b07-6296-43d8-8aff-2a6fdf1bd898" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:21 crc kubenswrapper[4740]: timeout: health rpc did not complete within 1s Jan 05 15:10:21 crc kubenswrapper[4740]: > Jan 05 15:10:21 crc kubenswrapper[4740]: I0105 15:10:21.447913 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-xlq6v" podUID="3a862b07-6296-43d8-8aff-2a6fdf1bd898" containerName="registry-server" probeResult="failure" output=< Jan 
05 15:10:21 crc kubenswrapper[4740]: timeout: health rpc did not complete within 1s Jan 05 15:10:21 crc kubenswrapper[4740]: > Jan 05 15:10:22 crc kubenswrapper[4740]: I0105 15:10:22.088309 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-vp8lm" podUID="ba42b606-5ea8-4d51-a695-dc563937f304" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:22 crc kubenswrapper[4740]: I0105 15:10:22.088576 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-vp8lm" podUID="ba42b606-5ea8-4d51-a695-dc563937f304" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:22 crc kubenswrapper[4740]: I0105 15:10:22.170309 4740 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-mrdpz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.61:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:22 crc kubenswrapper[4740]: I0105 15:10:22.170407 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" podUID="22d5567a-2314-42aa-b197-dac963dcbfd1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.61:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:22 crc kubenswrapper[4740]: I0105 15:10:22.170484 4740 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-mrdpz container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.61:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:22 crc kubenswrapper[4740]: I0105 15:10:22.170537 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" podUID="22d5567a-2314-42aa-b197-dac963dcbfd1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.61:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:22 crc kubenswrapper[4740]: I0105 15:10:22.369134 4740 patch_prober.go:28] interesting pod/console-operator-58897d9998-gwgzt container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:22 crc kubenswrapper[4740]: I0105 15:10:22.369194 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" podUID="b8f9d516-9c35-42b4-a4d5-e6d189053b5e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:22 crc kubenswrapper[4740]: I0105 15:10:22.369204 4740 patch_prober.go:28] interesting pod/console-operator-58897d9998-gwgzt container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get 
\"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:22 crc kubenswrapper[4740]: I0105 15:10:22.369281 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" podUID="b8f9d516-9c35-42b4-a4d5-e6d189053b5e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:22 crc kubenswrapper[4740]: I0105 15:10:22.383955 4740 patch_prober.go:28] interesting pod/thanos-querier-78985bc954-b6gsd container/kube-rbac-proxy-web namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.86:9091/-/healthy\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:22 crc kubenswrapper[4740]: I0105 15:10:22.383994 4740 patch_prober.go:28] interesting pod/thanos-querier-78985bc954-b6gsd container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.86:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:22 crc kubenswrapper[4740]: I0105 15:10:22.384030 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" podUID="3c267f0a-b0bb-43fe-9a21-92472096a632" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.86:9091/-/healthy\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:22 crc kubenswrapper[4740]: I0105 15:10:22.384049 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" podUID="3c267f0a-b0bb-43fe-9a21-92472096a632" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.86:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:22 crc kubenswrapper[4740]: I0105 15:10:22.631266 4740 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-mn8lv container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:22 crc kubenswrapper[4740]: I0105 15:10:22.631541 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" podUID="fb20baf5-f4d5-4234-9552-f4d73c447fcc" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:23 crc kubenswrapper[4740]: I0105 15:10:23.594246 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8svgw container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:23 crc kubenswrapper[4740]: I0105 15:10:23.594560 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" 
podUID="fd41c01f-dff3-4b6a-ae38-8b114b384a59" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.53:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:23 crc kubenswrapper[4740]: I0105 15:10:23.732552 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="af28d147-ae7a-4827-8d93-34cc6e211e3b" containerName="prometheus" probeResult="failure" output="command timed out" Jan 05 15:10:23 crc kubenswrapper[4740]: I0105 15:10:23.732851 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="af28d147-ae7a-4827-8d93-34cc6e211e3b" containerName="prometheus" probeResult="failure" output="command timed out" Jan 05 15:10:23 crc kubenswrapper[4740]: I0105 15:10:23.732923 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="bf526b0a-998c-4943-bfba-04352421ed58" containerName="ovn-northd" probeResult="failure" output="command timed out" Jan 05 15:10:23 crc kubenswrapper[4740]: I0105 15:10:23.734253 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="bf526b0a-998c-4943-bfba-04352421ed58" containerName="ovn-northd" probeResult="failure" output="command timed out" Jan 05 15:10:23 crc kubenswrapper[4740]: I0105 15:10:23.939169 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8vc2p container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:23 crc kubenswrapper[4740]: I0105 15:10:23.939252 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" podUID="1e34aa71-7c05-4606-a71a-2c5b20667ba1" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.54:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:23 crc kubenswrapper[4740]: I0105 15:10:23.954318 4740 patch_prober.go:28] interesting pod/console-78554f9b97-scjhq container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.145:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:23 crc kubenswrapper[4740]: I0105 15:10:23.954376 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-78554f9b97-scjhq" podUID="322864a4-da88-47f5-9d44-89a38dd2d8f3" containerName="console" probeResult="failure" output="Get \"https://10.217.0.145:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:23 crc kubenswrapper[4740]: I0105 15:10:23.971094 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:10:23 crc kubenswrapper[4740]: E0105 15:10:23.972908 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" 
Jan 05 15:10:25 crc kubenswrapper[4740]: I0105 15:10:25.175282 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:25 crc kubenswrapper[4740]: I0105 15:10:25.175564 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:25 crc kubenswrapper[4740]: I0105 15:10:25.175289 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:25 crc kubenswrapper[4740]: I0105 15:10:25.175658 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:25 crc kubenswrapper[4740]: I0105 15:10:25.673857 4740 patch_prober.go:28] interesting pod/metrics-server-85559f775f-nmxz8 container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:25 crc kubenswrapper[4740]: I0105 15:10:25.673915 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" podUID="8e7d78e7-6855-46c1-a51c-7f4127c80b7d" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:26 crc kubenswrapper[4740]: I0105 15:10:26.068930 4740 patch_prober.go:28] interesting pod/monitoring-plugin-6fcb8d88f7-lcqs4 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.89:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:26 crc kubenswrapper[4740]: I0105 15:10:26.069021 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4" podUID="c85eb61e-dde6-42e7-b3d9-82837d0104d7" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.89:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:26 crc kubenswrapper[4740]: I0105 15:10:26.738834 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="2b922e17-0ca9-49cd-8af7-b78776b990bb" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Jan 05 15:10:27 crc kubenswrapper[4740]: I0105 15:10:27.093731 4740 patch_prober.go:28] interesting 
pod/route-controller-manager-bf4f87789-2p768 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:27 crc kubenswrapper[4740]: I0105 15:10:27.093832 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" podUID="f4272a46-6424-418e-baf7-dba25f1813c4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:27 crc kubenswrapper[4740]: I0105 15:10:27.094143 4740 patch_prober.go:28] interesting pod/route-controller-manager-bf4f87789-2p768 container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:27 crc kubenswrapper[4740]: I0105 15:10:27.094220 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" podUID="f4272a46-6424-418e-baf7-dba25f1813c4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:27 crc kubenswrapper[4740]: I0105 15:10:27.722423 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" podUID="d8ffad98-ed22-4c4c-b0b8-234c3358089e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:27 crc kubenswrapper[4740]: I0105 15:10:27.821645 4740 patch_prober.go:28] interesting pod/controller-manager-6cdbbbb5bc-rbd47 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:27 crc kubenswrapper[4740]: I0105 15:10:27.821706 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" podUID="f061b27b-1e1c-42c2-9230-8f82deda6325" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:27 crc kubenswrapper[4740]: I0105 15:10:27.822078 4740 patch_prober.go:28] interesting pod/controller-manager-6cdbbbb5bc-rbd47 container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:27 crc kubenswrapper[4740]: I0105 15:10:27.822097 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" podUID="f061b27b-1e1c-42c2-9230-8f82deda6325" containerName="controller-manager" probeResult="failure" 
output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:27 crc kubenswrapper[4740]: I0105 15:10:27.903340 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" podUID="3868391b-95fe-40be-a77d-593ea72fd786" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:27 crc kubenswrapper[4740]: I0105 15:10:27.908994 4740 patch_prober.go:28] interesting pod/logging-loki-distributor-5f678c8dd6-bdcsh container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:27 crc kubenswrapper[4740]: I0105 15:10:27.909081 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" podUID="7927742d-54ad-4fbb-841a-71d40648d88e" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.099771 4740 patch_prober.go:28] interesting pod/logging-loki-querier-76788598db-n78fw container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.099859 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-76788598db-n78fw" podUID="5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.122323 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" podUID="cb54fddc-8710-4066-908b-bb7a00a15c7e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.122379 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" podUID="d4245835-8bf3-4491-9e66-a456d2fea83d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.135994 4740 patch_prober.go:28] interesting pod/logging-loki-query-frontend-69d9546745-lbmjc container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.136046 4740 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" podUID="d8fd9857-fca2-4041-9c72-3747c84b6987" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.205212 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" podUID="9d82f35e-307a-4ed2-89e0-0649e5300e41" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.205322 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-bn857" podUID="ca259c15-4c6d-4142-b257-12e805385d3f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.357262 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-9b6f8f78c-6wdrm" podUID="ab509865-4e08-4927-b702-f28bfb553a27" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.544314 4740 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lfg5l container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.5:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.544709 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" podUID="45f52c16-f526-4498-bc85-2aec3b292a60" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.5:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.544340 4740 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lfg5l container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.5:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.544839 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" podUID="45f52c16-f526-4498-bc85-2aec3b292a60" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.5:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.593860 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8svgw container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.593922 4740 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" podUID="fd41c01f-dff3-4b6a-ae38-8b114b384a59" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.723260 4740 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-lpjfp container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.12:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.723316 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" podUID="b13bfd05-5a88-449b-9d26-f11acf9c6bbf" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.12:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.732977 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="bf526b0a-998c-4943-bfba-04352421ed58" containerName="ovn-northd" probeResult="failure" output="command timed out" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.733239 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ovn-northd-0" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.733259 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="bf526b0a-998c-4943-bfba-04352421ed58" containerName="ovn-northd" probeResult="failure" output="command timed out" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.733310 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="af28d147-ae7a-4827-8d93-34cc6e211e3b" containerName="prometheus" probeResult="failure" output="command timed out" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.733396 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="af28d147-ae7a-4827-8d93-34cc6e211e3b" containerName="prometheus" probeResult="failure" output="command timed out" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.733616 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.734966 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ovn-northd" containerStatusID={"Type":"cri-o","ID":"51aab52be8560eabd07cc70785842655eb3d8925eea6f91beea05489f09210b4"} pod="openstack/ovn-northd-0" containerMessage="Container ovn-northd failed liveness probe, will be restarted" Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.735123 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="bf526b0a-998c-4943-bfba-04352421ed58" containerName="ovn-northd" containerID="cri-o://51aab52be8560eabd07cc70785842655eb3d8925eea6f91beea05489f09210b4" gracePeriod=30 Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.941291 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8vc2p container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8083/ready\": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:28 crc kubenswrapper[4740]: I0105 15:10:28.941364 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" podUID="1e34aa71-7c05-4606-a71a-2c5b20667ba1" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:29 crc kubenswrapper[4740]: I0105 15:10:29.247295 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" podUID="939f9254-d7a5-48cb-8ab1-7ea2e4f68610" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.101:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:29 crc kubenswrapper[4740]: I0105 15:10:29.247306 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" podUID="939f9254-d7a5-48cb-8ab1-7ea2e4f68610" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.101:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:29 crc kubenswrapper[4740]: I0105 15:10:29.734088 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="dbb5263b-e98b-48a4-825e-ffb99738059f" containerName="galera" probeResult="failure" output="command timed out" Jan 05 15:10:29 crc kubenswrapper[4740]: I0105 15:10:29.739489 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="dbb5263b-e98b-48a4-825e-ffb99738059f" containerName="galera" probeResult="failure" output="command timed out" Jan 05 15:10:29 crc kubenswrapper[4740]: I0105 15:10:29.743772 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="bf526b0a-998c-4943-bfba-04352421ed58" containerName="ovn-northd" probeResult="failure" output="command timed out" Jan 05 15:10:29 crc kubenswrapper[4740]: I0105 15:10:29.873377 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" podUID="6faf05ee-49e0-4d3e-afcd-d11d9494da44" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:29 crc kubenswrapper[4740]: I0105 15:10:29.873583 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" podUID="6faf05ee-49e0-4d3e-afcd-d11d9494da44" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:30 crc kubenswrapper[4740]: I0105 15:10:30.495320 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" podUID="afd48a06-614a-4fed-8629-a1a2eb83ab80" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.102:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:30 crc kubenswrapper[4740]: I0105 15:10:30.496386 4740 prober.go:107] "Probe failed" 
probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" podUID="afd48a06-614a-4fed-8629-a1a2eb83ab80" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.102:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:30 crc kubenswrapper[4740]: I0105 15:10:30.496448 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-ffh7k" podUID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:30 crc kubenswrapper[4740]: I0105 15:10:30.496472 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-ffh7k" podUID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:30 crc kubenswrapper[4740]: I0105 15:10:30.537034 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-xlq6v" podUID="3a862b07-6296-43d8-8aff-2a6fdf1bd898" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:30 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:30 crc kubenswrapper[4740]: > Jan 05 15:10:30 crc kubenswrapper[4740]: I0105 15:10:30.540486 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-xlq6v" podUID="3a862b07-6296-43d8-8aff-2a6fdf1bd898" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:30 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:30 crc kubenswrapper[4740]: > Jan 05 15:10:30 crc kubenswrapper[4740]: I0105 15:10:30.561788 4740 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-n6tdc container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:30 crc kubenswrapper[4740]: I0105 15:10:30.561845 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" podUID="1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:30 crc kubenswrapper[4740]: I0105 15:10:30.562280 4740 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-n6tdc container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:30 crc kubenswrapper[4740]: I0105 15:10:30.562304 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" podUID="1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8" containerName="prometheus-operator-admission-webhook" 
probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:30 crc kubenswrapper[4740]: I0105 15:10:30.699321 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" podUID="237e57c0-4e0a-4c9f-89a2-e5a84fb41d22" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.44:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:30 crc kubenswrapper[4740]: I0105 15:10:30.699391 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" podUID="237e57c0-4e0a-4c9f-89a2-e5a84fb41d22" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.44:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:30 crc kubenswrapper[4740]: I0105 15:10:30.732328 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="261c319a-37da-4987-a774-ecc24fa6b083" containerName="galera" probeResult="failure" output="command timed out" Jan 05 15:10:30 crc kubenswrapper[4740]: I0105 15:10:30.732912 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="261c319a-37da-4987-a774-ecc24fa6b083" containerName="galera" probeResult="failure" output="command timed out" Jan 05 15:10:31 crc kubenswrapper[4740]: I0105 15:10:31.736834 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="2b922e17-0ca9-49cd-8af7-b78776b990bb" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Jan 05 15:10:31 crc kubenswrapper[4740]: I0105 15:10:31.737430 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Jan 05 15:10:31 crc kubenswrapper[4740]: I0105 15:10:31.750111 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"81e8ba9d6865673f9d753a4677be70994eece7af3846dfc661ceebd4675c4a19"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted" Jan 05 15:10:31 crc kubenswrapper[4740]: I0105 15:10:31.750231 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b922e17-0ca9-49cd-8af7-b78776b990bb" containerName="ceilometer-central-agent" containerID="cri-o://81e8ba9d6865673f9d753a4677be70994eece7af3846dfc661ceebd4675c4a19" gracePeriod=30 Jan 05 15:10:32 crc kubenswrapper[4740]: I0105 15:10:32.369040 4740 patch_prober.go:28] interesting pod/console-operator-58897d9998-gwgzt container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:32 crc kubenswrapper[4740]: I0105 15:10:32.369087 4740 patch_prober.go:28] interesting pod/console-operator-58897d9998-gwgzt container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Jan 05 15:10:32 crc kubenswrapper[4740]: I0105 15:10:32.369135 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" podUID="b8f9d516-9c35-42b4-a4d5-e6d189053b5e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:32 crc kubenswrapper[4740]: I0105 15:10:32.369146 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" podUID="b8f9d516-9c35-42b4-a4d5-e6d189053b5e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:32 crc kubenswrapper[4740]: E0105 15:10:32.480161 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="51aab52be8560eabd07cc70785842655eb3d8925eea6f91beea05489f09210b4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 05 15:10:32 crc kubenswrapper[4740]: E0105 15:10:32.481728 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="51aab52be8560eabd07cc70785842655eb3d8925eea6f91beea05489f09210b4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 05 15:10:32 crc kubenswrapper[4740]: E0105 15:10:32.483948 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="51aab52be8560eabd07cc70785842655eb3d8925eea6f91beea05489f09210b4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 05 15:10:32 crc kubenswrapper[4740]: E0105 15:10:32.484030 4740 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="bf526b0a-998c-4943-bfba-04352421ed58" containerName="ovn-northd" Jan 05 15:10:32 crc kubenswrapper[4740]: I0105 15:10:32.632098 4740 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-mn8lv container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:32 crc kubenswrapper[4740]: I0105 15:10:32.632159 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" podUID="fb20baf5-f4d5-4234-9552-f4d73c447fcc" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:33 crc kubenswrapper[4740]: I0105 15:10:33.593822 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8svgw container/opa namespace/openshift-logging: Readiness probe 
status=failure output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:33 crc kubenswrapper[4740]: I0105 15:10:33.594109 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8svgw container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:33 crc kubenswrapper[4740]: I0105 15:10:33.595259 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" podUID="fd41c01f-dff3-4b6a-ae38-8b114b384a59" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:33 crc kubenswrapper[4740]: I0105 15:10:33.595304 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" podUID="fd41c01f-dff3-4b6a-ae38-8b114b384a59" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.53:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:33 crc kubenswrapper[4740]: I0105 15:10:33.940687 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8vc2p container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:33 crc kubenswrapper[4740]: I0105 15:10:33.940960 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" podUID="1e34aa71-7c05-4606-a71a-2c5b20667ba1" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.54:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:33 crc kubenswrapper[4740]: I0105 15:10:33.940817 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8vc2p container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:33 crc kubenswrapper[4740]: I0105 15:10:33.941099 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" podUID="1e34aa71-7c05-4606-a71a-2c5b20667ba1" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:33 crc kubenswrapper[4740]: I0105 15:10:33.955733 4740 patch_prober.go:28] interesting pod/console-78554f9b97-scjhq container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.145:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:33 crc kubenswrapper[4740]: I0105 15:10:33.955790 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-78554f9b97-scjhq" podUID="322864a4-da88-47f5-9d44-89a38dd2d8f3" containerName="console" 
probeResult="failure" output="Get \"https://10.217.0.145:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:34 crc kubenswrapper[4740]: I0105 15:10:34.634743 4740 patch_prober.go:28] interesting pod/oauth-openshift-68b95d957b-fwtrs container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:34 crc kubenswrapper[4740]: I0105 15:10:34.635103 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" podUID="4e0355b5-87f3-4eeb-b13f-8287d9cb0786" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:34 crc kubenswrapper[4740]: I0105 15:10:34.634848 4740 patch_prober.go:28] interesting pod/oauth-openshift-68b95d957b-fwtrs container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:34 crc kubenswrapper[4740]: I0105 15:10:34.635189 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" podUID="4e0355b5-87f3-4eeb-b13f-8287d9cb0786" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:34 crc kubenswrapper[4740]: I0105 15:10:34.737813 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="2b922e17-0ca9-49cd-8af7-b78776b990bb" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Jan 05 15:10:34 crc kubenswrapper[4740]: I0105 15:10:34.808684 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bf526b0a-998c-4943-bfba-04352421ed58/ovn-northd/0.log" Jan 05 15:10:34 crc kubenswrapper[4740]: I0105 15:10:34.808877 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bf526b0a-998c-4943-bfba-04352421ed58","Type":"ContainerDied","Data":"51aab52be8560eabd07cc70785842655eb3d8925eea6f91beea05489f09210b4"} Jan 05 15:10:34 crc kubenswrapper[4740]: I0105 15:10:34.809172 4740 generic.go:334] "Generic (PLEG): container finished" podID="bf526b0a-998c-4943-bfba-04352421ed58" containerID="51aab52be8560eabd07cc70785842655eb3d8925eea6f91beea05489f09210b4" exitCode=139 Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.217302 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.217634 4740 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qhb46 container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": 
net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.217667 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.217696 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" podUID="8255876c-990a-4658-8d74-66b4d45e379c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.217417 4740 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qhb46 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.217443 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.217773 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" podUID="8255876c-990a-4658-8d74-66b4d45e379c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.217801 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.238960 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="6d1b6a83-0692-4de9-8d5f-56f4371b9d22" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.11:8080/livez\": context deadline exceeded" Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.238997 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="6d1b6a83-0692-4de9-8d5f-56f4371b9d22" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.11:8081/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.242612 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-mkqxp" podUID="dc07eab5-3d3e-4da1-aff1-dc180039a90a" containerName="registry-server" probeResult="failure" output=< 
Jan 05 15:10:35 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:35 crc kubenswrapper[4740]: > Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.242612 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-mkqxp" podUID="dc07eab5-3d3e-4da1-aff1-dc180039a90a" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:35 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:35 crc kubenswrapper[4740]: > Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.248944 4740 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-tzhlt container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.248992 4740 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-tzhlt container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.249021 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" podUID="b43916b9-e413-4a80-880a-3feee7227ec5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.248989 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" podUID="b43916b9-e413-4a80-880a-3feee7227ec5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.280254 4740 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xvzzm container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.280336 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" podUID="eb24541e-e924-4085-b91e-e2d5a0bc8349" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.280254 4740 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xvzzm container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.280432 4740 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" podUID="eb24541e-e924-4085-b91e-e2d5a0bc8349" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.370435 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-dwcfs" podUID="b023e432-3b4e-4161-bfcc-b5d8b601e9d5" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:35 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:35 crc kubenswrapper[4740]: > Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.493432 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-tj7tn" podUID="4df26838-83be-4000-b37e-841a0457717b" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:35 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:35 crc kubenswrapper[4740]: > Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.496969 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-tj7tn" podUID="4df26838-83be-4000-b37e-841a0457717b" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:35 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:35 crc kubenswrapper[4740]: > Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.672798 4740 patch_prober.go:28] interesting pod/metrics-server-85559f775f-nmxz8 container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.672852 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" podUID="8e7d78e7-6855-46c1-a51c-7f4127c80b7d" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.674232 4740 patch_prober.go:28] interesting pod/metrics-server-85559f775f-nmxz8 container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.674316 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" podUID="8e7d78e7-6855-46c1-a51c-7f4127c80b7d" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.696389 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-bgrks" podUID="6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7" containerName="registry-server" 
probeResult="failure" output=< Jan 05 15:10:35 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:35 crc kubenswrapper[4740]: > Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.942871 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-dwcfs" podUID="b023e432-3b4e-4161-bfcc-b5d8b601e9d5" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:35 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:35 crc kubenswrapper[4740]: > Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.943390 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-8q5ns" podUID="4e38ea3c-6147-4049-a885-c9a247a5697c" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:35 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:35 crc kubenswrapper[4740]: > Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.944828 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-dgjkq" podUID="02c01131-569c-43e4-b848-8d4b49a383d4" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:35 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:35 crc kubenswrapper[4740]: > Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.947364 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-bgrks" podUID="6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:35 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:35 crc kubenswrapper[4740]: > Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.947783 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-8q5ns" podUID="4e38ea3c-6147-4049-a885-c9a247a5697c" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:35 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:35 crc kubenswrapper[4740]: > Jan 05 15:10:35 crc kubenswrapper[4740]: I0105 15:10:35.947965 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-dgjkq" podUID="02c01131-569c-43e4-b848-8d4b49a383d4" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:35 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:35 crc kubenswrapper[4740]: > Jan 05 15:10:36 crc kubenswrapper[4740]: I0105 15:10:36.068512 4740 patch_prober.go:28] interesting pod/monitoring-plugin-6fcb8d88f7-lcqs4 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.89:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:36 crc kubenswrapper[4740]: I0105 15:10:36.068896 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4" podUID="c85eb61e-dde6-42e7-b3d9-82837d0104d7" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.89:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:37 crc 
kubenswrapper[4740]: I0105 15:10:37.010222 4740 patch_prober.go:28] interesting pod/route-controller-manager-bf4f87789-2p768 container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.010260 4740 patch_prober.go:28] interesting pod/route-controller-manager-bf4f87789-2p768 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.010273 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" podUID="f4272a46-6424-418e-baf7-dba25f1813c4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.010312 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.010304 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" podUID="f4272a46-6424-418e-baf7-dba25f1813c4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.011481 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="route-controller-manager" containerStatusID={"Type":"cri-o","ID":"fe10947fde16ca0dc4ddddad37e7043c438752dd275cb3f058bb1d9132ca7eaf"} pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" containerMessage="Container route-controller-manager failed liveness probe, will be restarted" Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.011678 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" podUID="f4272a46-6424-418e-baf7-dba25f1813c4" containerName="route-controller-manager" containerID="cri-o://fe10947fde16ca0dc4ddddad37e7043c438752dd275cb3f058bb1d9132ca7eaf" gracePeriod=30 Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.051847 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="a87008bf-295c-4343-a6b2-f3fd37fa581d" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.168:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.052174 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="a87008bf-295c-4343-a6b2-f3fd37fa581d" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.168:9090/-/healthy\": 
net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.384054 4740 patch_prober.go:28] interesting pod/thanos-querier-78985bc954-b6gsd container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.86:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.384163 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-78985bc954-b6gsd" podUID="3c267f0a-b0bb-43fe-9a21-92472096a632" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.86:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:37 crc kubenswrapper[4740]: E0105 15:10:37.478747 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 51aab52be8560eabd07cc70785842655eb3d8925eea6f91beea05489f09210b4 is running failed: container process not found" containerID="51aab52be8560eabd07cc70785842655eb3d8925eea6f91beea05489f09210b4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 05 15:10:37 crc kubenswrapper[4740]: E0105 15:10:37.479558 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 51aab52be8560eabd07cc70785842655eb3d8925eea6f91beea05489f09210b4 is running failed: container process not found" containerID="51aab52be8560eabd07cc70785842655eb3d8925eea6f91beea05489f09210b4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 05 15:10:37 crc kubenswrapper[4740]: E0105 15:10:37.479959 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 51aab52be8560eabd07cc70785842655eb3d8925eea6f91beea05489f09210b4 is running failed: container process not found" containerID="51aab52be8560eabd07cc70785842655eb3d8925eea6f91beea05489f09210b4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 05 15:10:37 crc kubenswrapper[4740]: E0105 15:10:37.480031 4740 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 51aab52be8560eabd07cc70785842655eb3d8925eea6f91beea05489f09210b4 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="bf526b0a-998c-4943-bfba-04352421ed58" containerName="ovn-northd" Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.654308 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-5dgxw" podUID="a07332cc-11af-4d3a-8761-891417586bd1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.654444 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-5dgxw" podUID="a07332cc-11af-4d3a-8761-891417586bd1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 
15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.762318 4740 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.762386 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.762610 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" podUID="d8ffad98-ed22-4c4c-b0b8-234c3358089e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.762591 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" podUID="d8ffad98-ed22-4c4c-b0b8-234c3358089e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.940824 4740 patch_prober.go:28] interesting pod/controller-manager-6cdbbbb5bc-rbd47 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.940925 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" podUID="f061b27b-1e1c-42c2-9230-8f82deda6325" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.941528 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-operator-6d46c7d5f9-sm4zx" podUID="97dce6b2-fc01-4ced-a77a-a506dcb06eff" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.107:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.941600 4740 patch_prober.go:28] interesting pod/controller-manager-6cdbbbb5bc-rbd47 container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.941629 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" podUID="f061b27b-1e1c-42c2-9230-8f82deda6325" 
containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.941705 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.942456 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l" podUID="e2dc84c3-c204-4f17-bcf3-418ab17b873d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.947636 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="controller-manager" containerStatusID={"Type":"cri-o","ID":"642b6cdffc94c05cafdd62614d483b3daf241708b521d3a8f422c714c83ea1c7"} pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" containerMessage="Container controller-manager failed liveness probe, will be restarted" Jan 05 15:10:37 crc kubenswrapper[4740]: I0105 15:10:37.947729 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" podUID="f061b27b-1e1c-42c2-9230-8f82deda6325" containerName="controller-manager" containerID="cri-o://642b6cdffc94c05cafdd62614d483b3daf241708b521d3a8f422c714c83ea1c7" gracePeriod=30 Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.107272 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" podUID="3868391b-95fe-40be-a77d-593ea72fd786" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.107376 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-operator-6d46c7d5f9-sm4zx" podUID="97dce6b2-fc01-4ced-a77a-a506dcb06eff" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.107273 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l" podUID="e2dc84c3-c204-4f17-bcf3-418ab17b873d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.107514 4740 patch_prober.go:28] interesting pod/logging-loki-distributor-5f678c8dd6-bdcsh container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.107551 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" podUID="7927742d-54ad-4fbb-841a-71d40648d88e" containerName="loki-distributor" 
probeResult="failure" output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.107598 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-skl95" podUID="61306334-5c80-4b48-8c47-bbc9a26f5ef3" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.107648 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-skl95" podUID="61306334-5c80-4b48-8c47-bbc9a26f5ef3" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.108249 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.192270 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" podUID="3868391b-95fe-40be-a77d-593ea72fd786" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.192382 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" podUID="cb54fddc-8710-4066-908b-bb7a00a15c7e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.357237 4740 patch_prober.go:28] interesting pod/logging-loki-querier-76788598db-n78fw container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.357550 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-76788598db-n78fw" podUID="5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.357627 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.357289 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-bn857" podUID="ca259c15-4c6d-4142-b257-12e805385d3f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.524271 4740 
patch_prober.go:28] interesting pod/logging-loki-query-frontend-69d9546745-lbmjc container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.524274 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" podUID="9d82f35e-307a-4ed2-89e0-0649e5300e41" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.524362 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" podUID="d8fd9857-fca2-4041-9c72-3747c84b6987" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.524380 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" podUID="cb54fddc-8710-4066-908b-bb7a00a15c7e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.524460 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.594602 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8svgw container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.594912 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" podUID="fd41c01f-dff3-4b6a-ae38-8b114b384a59" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.53:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.606304 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz" podUID="da2bbca8-6d81-4fd9-b2a6-e52f98b7fb29" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.606310 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" podUID="d4245835-8bf3-4491-9e66-a456d2fea83d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.606328 4740 patch_prober.go:28] interesting 
pod/nmstate-webhook-f8fb84555-zvv7k container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.93:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.606589 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-f8fb84555-zvv7k" podUID="e6df01b0-f4c2-49c2-982d-4b814fd5d493" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.93:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.606619 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" podUID="d4245835-8bf3-4491-9e66-a456d2fea83d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.606664 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-bn857" podUID="ca259c15-4c6d-4142-b257-12e805385d3f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.606715 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8svgw container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.606744 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" podUID="fd41c01f-dff3-4b6a-ae38-8b114b384a59" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.688308 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-vvlkq" podUID="d1228a5b-52ed-4d7e-940e-b4b03288fae5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.688309 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-szhmd" podUID="35577830-7016-49ad-bae0-8a9962a2e82c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.748898 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-58hgk" podUID="a4bfce1d-9af3-49f5-877e-b5ea29088ac7" containerName="nmstate-handler" probeResult="failure" output="command timed out" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.853223 4740 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-szhmd" podUID="35577830-7016-49ad-bae0-8a9962a2e82c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.853298 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-jbqnp" podUID="c5e3ed99-183e-41f6-bbee-d5c8e7f629d1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.127:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.935439 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" podUID="9d82f35e-307a-4ed2-89e0-0649e5300e41" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.935468 4740 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lfg5l container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.5:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.935573 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" podUID="45f52c16-f526-4498-bc85-2aec3b292a60" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.5:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.935623 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz" podUID="da2bbca8-6d81-4fd9-b2a6-e52f98b7fb29" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.935665 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-8qngq" podUID="71602a38-6096-4224-b49a-adfccfe02180" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.935701 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-vvlkq" podUID="d1228a5b-52ed-4d7e-940e-b4b03288fae5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.941657 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8vc2p container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.941724 4740 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" podUID="1e34aa71-7c05-4606-a71a-2c5b20667ba1" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.54:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.941729 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8vc2p container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.941777 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" podUID="1e34aa71-7c05-4606-a71a-2c5b20667ba1" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:38 crc kubenswrapper[4740]: I0105 15:10:38.969948 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.018540 4740 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-lpjfp container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.12:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.018572 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw" podUID="c7f964a1-5e19-4cb6-8e25-26fdc09410af" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.018610 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" podUID="b13bfd05-5a88-449b-9d26-f11acf9c6bbf" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.12:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.018613 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw" podUID="c7f964a1-5e19-4cb6-8e25-26fdc09410af" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.018697 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-jbqnp" podUID="c5e3ed99-183e-41f6-bbee-d5c8e7f629d1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.127:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.018765 4740 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get 
\"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.018786 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.018829 4740 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lfg5l container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.5:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.018846 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" podUID="45f52c16-f526-4498-bc85-2aec3b292a60" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.5:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.018889 4740 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-lpjfp container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.12:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.018923 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" podUID="b13bfd05-5a88-449b-9d26-f11acf9c6bbf" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.12:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.109575 4740 patch_prober.go:28] interesting pod/logging-loki-distributor-5f678c8dd6-bdcsh container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.109633 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" podUID="7927742d-54ad-4fbb-841a-71d40648d88e" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.150715 4740 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.151144 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" 
podUID="7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.55:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.246248 4740 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.246266 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" podUID="939f9254-d7a5-48cb-8ab1-7ea2e4f68610" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.101:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.246309 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.56:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.246363 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.246365 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" podUID="939f9254-d7a5-48cb-8ab1-7ea2e4f68610" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.101:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.246397 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.246421 4740 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.246452 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="572c5991-5ab4-4786-9641-cc8f3ff4bd21" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.247659 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="webhook-server" containerStatusID={"Type":"cri-o","ID":"02dba798ce836469378a30015bc83abc00249bee63a8a9c082f291b3dda415a1"} pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" containerMessage="Container webhook-server failed liveness probe, will be restarted" Jan 05 15:10:39 crc 
kubenswrapper[4740]: I0105 15:10:39.247722 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" podUID="939f9254-d7a5-48cb-8ab1-7ea2e4f68610" containerName="webhook-server" containerID="cri-o://02dba798ce836469378a30015bc83abc00249bee63a8a9c082f291b3dda415a1" gracePeriod=2 Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.287271 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-65b6fb4bb9-pfr54" podUID="8de8b15c-984a-4c5f-a956-5f4244da97ef" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.100:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.358938 4740 patch_prober.go:28] interesting pod/logging-loki-querier-76788598db-n78fw container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.359260 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-76788598db-n78fw" podUID="5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.525077 4740 patch_prober.go:28] interesting pod/logging-loki-query-frontend-69d9546745-lbmjc container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.525139 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" podUID="d8fd9857-fca2-4041-9c72-3747c84b6987" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.733811 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="dbb5263b-e98b-48a4-825e-ffb99738059f" containerName="galera" probeResult="failure" output="command timed out" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.734256 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-galera-0" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.735829 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="af28d147-ae7a-4827-8d93-34cc6e211e3b" containerName="prometheus" probeResult="failure" output="command timed out" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.736374 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="dbb5263b-e98b-48a4-825e-ffb99738059f" containerName="galera" probeResult="failure" output="command timed out" Jan 05 15:10:39 crc kubenswrapper[4740]: 
I0105 15:10:39.736450 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.744647 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"835db857eed7aafbfc05fddb2e1acc2bfca45dd9afefccb55f578e40d71c0888"} pod="openstack/openstack-galera-0" containerMessage="Container galera failed liveness probe, will be restarted" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.744651 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="af28d147-ae7a-4827-8d93-34cc6e211e3b" containerName="prometheus" probeResult="failure" output="command timed out" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.832280 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" podUID="6faf05ee-49e0-4d3e-afcd-d11d9494da44" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:39 crc kubenswrapper[4740]: I0105 15:10:39.892365 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" podUID="3f0a6bbe-32b3-4e4d-afef-32e871616c6d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.287372 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" podUID="939f9254-d7a5-48cb-8ab1-7ea2e4f68610" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.101:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.619259 4740 patch_prober.go:28] interesting pod/loki-operator-controller-manager-56d45b676b-q44gh container/manager namespace/openshift-operators-redhat: Liveness probe status=failure output="Get \"http://10.217.0.48:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.619310 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" podUID="50147f9c-3a52-4e0e-b0cc-1fd94e7def10" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.48:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.702222 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" podUID="afd48a06-614a-4fed-8629-a1a2eb83ab80" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.102:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.702370 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-ffh7k" podUID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.702659 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.702357 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" podUID="afd48a06-614a-4fed-8629-a1a2eb83ab80" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.102:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.702726 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-ffh7k" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.702812 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.704634 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr-k8s-webhook-server" containerStatusID={"Type":"cri-o","ID":"49f06f29e33c5e348a2da149a499c53a1185432ff82fc3000fe402e4826d6fb3"} pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" containerMessage="Container frr-k8s-webhook-server failed liveness probe, will be restarted" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.704715 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" podUID="afd48a06-614a-4fed-8629-a1a2eb83ab80" containerName="frr-k8s-webhook-server" containerID="cri-o://49f06f29e33c5e348a2da149a499c53a1185432ff82fc3000fe402e4826d6fb3" gracePeriod=10 Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.784283 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-5bddd4b946-pgtqj" podUID="22a24e17-6432-4f77-a553-47f9de4d68e4" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.103:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.784386 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-ffh7k" podUID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.784403 4740 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-n6tdc container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.784542 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" podUID="1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:40 crc 
kubenswrapper[4740]: I0105 15:10:40.784606 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.784673 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-ffh7k" podUID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.784437 4740 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-n6tdc container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.784809 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-ffh7k" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.784865 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" podUID="1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.784960 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.786231 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="prometheus-operator-admission-webhook" containerStatusID={"Type":"cri-o","ID":"803251059ac78863fda7261eae0f16876d61fed9009de0e860465cb2a83226e6"} pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" containerMessage="Container prometheus-operator-admission-webhook failed liveness probe, will be restarted" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.786271 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" podUID="1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8" containerName="prometheus-operator-admission-webhook" containerID="cri-o://803251059ac78863fda7261eae0f16876d61fed9009de0e860465cb2a83226e6" gracePeriod=30 Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.866454 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" podUID="237e57c0-4e0a-4c9f-89a2-e5a84fb41d22" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.44:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.866507 4740 patch_prober.go:28] interesting pod/loki-operator-controller-manager-56d45b676b-q44gh container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.48:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= 
Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.866578 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" podUID="50147f9c-3a52-4e0e-b0cc-1fd94e7def10" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.48:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.866531 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.866823 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" podUID="49bbef73-8653-4747-93ee-35819a394b1f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.129:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.866915 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-5bddd4b946-pgtqj" podUID="22a24e17-6432-4f77-a553-47f9de4d68e4" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.103:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.867032 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" podUID="237e57c0-4e0a-4c9f-89a2-e5a84fb41d22" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.44:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.867833 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cert-manager-webhook" containerStatusID={"Type":"cri-o","ID":"a81121b4cafa888982fe80a89360b5a2ef5f338e426058529b1b00dfba5ed46d"} pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" containerMessage="Container cert-manager-webhook failed liveness probe, will be restarted" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.867905 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" podUID="237e57c0-4e0a-4c9f-89a2-e5a84fb41d22" containerName="cert-manager-webhook" containerID="cri-o://a81121b4cafa888982fe80a89360b5a2ef5f338e426058529b1b00dfba5ed46d" gracePeriod=30 Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.878491 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bf526b0a-998c-4943-bfba-04352421ed58/ovn-northd/0.log" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.879278 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bf526b0a-998c-4943-bfba-04352421ed58","Type":"ContainerStarted","Data":"0087e7c1f3b90121aa561fd1707d4261e05ebf90aba84e8c8d64244759ba8328"} Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.879389 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.881916 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" 
event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"5b8a6b87265d54ac4ffe376ba3cafb17fa6677039615a1c5d95bd77a60f25bca"} Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.882956 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr" containerStatusID={"Type":"cri-o","ID":"e9cbc5b2b76dac75a477a645cfdc20161bb20a2f6ae0cf42bfb62042f7a39ab9"} pod="metallb-system/frr-k8s-ffh7k" containerMessage="Container frr failed liveness probe, will be restarted" Jan 05 15:10:40 crc kubenswrapper[4740]: I0105 15:10:40.883111 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-ffh7k" podUID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerName="frr" containerID="cri-o://e9cbc5b2b76dac75a477a645cfdc20161bb20a2f6ae0cf42bfb62042f7a39ab9" gracePeriod=2 Jan 05 15:10:41 crc kubenswrapper[4740]: I0105 15:10:41.574302 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" podUID="3661c83f-42d5-4441-95c9-bc94757cc85d" containerName="hostpath-provisioner" probeResult="failure" output="Get \"http://10.217.0.38:9898/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:41 crc kubenswrapper[4740]: I0105 15:10:41.643395 4740 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-lkk22 container/oauth-apiserver namespace/openshift-oauth-apiserver: Liveness probe status=failure output="Get \"https://10.217.0.6:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:41 crc kubenswrapper[4740]: I0105 15:10:41.643472 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" podUID="a03ac734-fdb2-4390-a5dc-1aed999390b4" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:41 crc kubenswrapper[4740]: I0105 15:10:41.643597 4740 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-lkk22 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:41 crc kubenswrapper[4740]: I0105 15:10:41.643655 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lkk22" podUID="a03ac734-fdb2-4390-a5dc-1aed999390b4" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:41 crc kubenswrapper[4740]: I0105 15:10:41.735201 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="261c319a-37da-4987-a774-ecc24fa6b083" containerName="galera" probeResult="failure" output="command timed out" Jan 05 15:10:41 crc kubenswrapper[4740]: I0105 15:10:41.735347 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 05 15:10:41 crc kubenswrapper[4740]: I0105 15:10:41.735723 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="261c319a-37da-4987-a774-ecc24fa6b083" containerName="galera" probeResult="failure" output="command timed out" Jan 05 
15:10:41 crc kubenswrapper[4740]: I0105 15:10:41.735889 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 05 15:10:41 crc kubenswrapper[4740]: I0105 15:10:41.740339 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"7bb6f8c6db25449b16a74dc5908883d2bd37222f79207529ff4c13d3419dd47e"} pod="openstack/openstack-cell1-galera-0" containerMessage="Container galera failed liveness probe, will be restarted" Jan 05 15:10:41 crc kubenswrapper[4740]: I0105 15:10:41.785631 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" podUID="afd48a06-614a-4fed-8629-a1a2eb83ab80" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.102:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:41 crc kubenswrapper[4740]: I0105 15:10:41.785645 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-ffh7k" podUID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.086223 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-vp8lm" podUID="ba42b606-5ea8-4d51-a695-dc563937f304" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.086254 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-vp8lm" podUID="ba42b606-5ea8-4d51-a695-dc563937f304" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.168358 4740 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-mrdpz container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.61:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.168436 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" podUID="22d5567a-2314-42aa-b197-dac963dcbfd1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.61:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.168505 4740 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-mrdpz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.61:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.168520 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-mrdpz" podUID="22d5567a-2314-42aa-b197-dac963dcbfd1" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.61:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.203232 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ffh7k" event={"ID":"3c26b9da-cc4e-44dc-92ba-92e42b962010","Type":"ContainerDied","Data":"e9cbc5b2b76dac75a477a645cfdc20161bb20a2f6ae0cf42bfb62042f7a39ab9"} Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.203497 4740 generic.go:334] "Generic (PLEG): container finished" podID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerID="e9cbc5b2b76dac75a477a645cfdc20161bb20a2f6ae0cf42bfb62042f7a39ab9" exitCode=143 Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.368149 4740 patch_prober.go:28] interesting pod/console-operator-58897d9998-gwgzt container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.368411 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" podUID="b8f9d516-9c35-42b4-a4d5-e6d189053b5e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.368208 4740 patch_prober.go:28] interesting pod/console-operator-58897d9998-gwgzt container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.368494 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" podUID="b8f9d516-9c35-42b4-a4d5-e6d189053b5e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.368724 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.368753 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.369733 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console-operator" containerStatusID={"Type":"cri-o","ID":"1cb8ce148fa643ebf57fc324bdbfab16b67bc36f03b57a86e7eb031a2cfc9617"} pod="openshift-console-operator/console-operator-58897d9998-gwgzt" containerMessage="Container console-operator failed liveness probe, will be restarted" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.369766 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" podUID="b8f9d516-9c35-42b4-a4d5-e6d189053b5e" containerName="console-operator" 
containerID="cri-o://1cb8ce148fa643ebf57fc324bdbfab16b67bc36f03b57a86e7eb031a2cfc9617" gracePeriod=30 Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.629926 4740 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-mn8lv container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.629989 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" podUID="fb20baf5-f4d5-4234-9552-f4d73c447fcc" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.630047 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.641099 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"a53d6d483e52d8db7e3b594ef4b0aeebecc626bd228672ccefde05069df831ca"} pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" containerMessage="Container authentication-operator failed liveness probe, will be restarted" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.641163 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" podUID="fb20baf5-f4d5-4234-9552-f4d73c447fcc" containerName="authentication-operator" containerID="cri-o://a53d6d483e52d8db7e3b594ef4b0aeebecc626bd228672ccefde05069df831ca" gracePeriod=30 Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.733235 4740 patch_prober.go:28] interesting pod/downloads-7954f5f757-tjvv9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.733262 4740 patch_prober.go:28] interesting pod/downloads-7954f5f757-tjvv9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.733289 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-tjvv9" podUID="f6a82703-5aea-4d1a-a8fa-3e4393a1176b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.733313 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tjvv9" podUID="f6a82703-5aea-4d1a-a8fa-3e4393a1176b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 
15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.733434 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="261c319a-37da-4987-a774-ecc24fa6b083" containerName="galera" probeResult="failure" output="command timed out" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.745685 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-7vp5p" podUID="836c7750-5680-4a56-8947-2df3b121bb3f" containerName="registry-server" probeResult="failure" output="command timed out" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.746201 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-7vp5p" podUID="836c7750-5680-4a56-8947-2df3b121bb3f" containerName="registry-server" probeResult="failure" output="command timed out" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.828305 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-ffh7k" podUID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:42 crc kubenswrapper[4740]: I0105 15:10:42.951762 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-pcbgg" podUID="3661c83f-42d5-4441-95c9-bc94757cc85d" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.218282 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ffh7k" event={"ID":"3c26b9da-cc4e-44dc-92ba-92e42b962010","Type":"ContainerStarted","Data":"125109b41bfffb572a7739a319c2a19dbafeb3fcb95b936f08c35d737be9a24a"} Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.220807 4740 generic.go:334] "Generic (PLEG): container finished" podID="939f9254-d7a5-48cb-8ab1-7ea2e4f68610" containerID="02dba798ce836469378a30015bc83abc00249bee63a8a9c082f291b3dda415a1" exitCode=137 Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.220871 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" event={"ID":"939f9254-d7a5-48cb-8ab1-7ea2e4f68610","Type":"ContainerDied","Data":"02dba798ce836469378a30015bc83abc00249bee63a8a9c082f291b3dda415a1"} Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.220910 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" event={"ID":"939f9254-d7a5-48cb-8ab1-7ea2e4f68610","Type":"ContainerStarted","Data":"bf2180134d02171fad1f34b50edcf803dd9be01db12d9bdc5893de526bad880a"} Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.221345 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" podUID="939f9254-d7a5-48cb-8ab1-7ea2e4f68610" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.101:7472/metrics\": dial tcp 10.217.0.101:7472: connect: connection refused" Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.369603 4740 patch_prober.go:28] interesting pod/console-operator-58897d9998-gwgzt container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers)" start-of-body= Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.369952 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" podUID="b8f9d516-9c35-42b4-a4d5-e6d189053b5e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.430504 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-ffh7k" Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.593811 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8svgw container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.593897 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" podUID="fd41c01f-dff3-4b6a-ae38-8b114b384a59" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.594096 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8svgw container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.594178 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" podUID="fd41c01f-dff3-4b6a-ae38-8b114b384a59" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.53:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.737280 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-lfp42" podUID="f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879" containerName="registry-server" probeResult="failure" output="command timed out" Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.737280 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-lfp42" podUID="f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879" containerName="registry-server" probeResult="failure" output="command timed out" Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.738239 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-xlq6v" podUID="3a862b07-6296-43d8-8aff-2a6fdf1bd898" containerName="registry-server" probeResult="failure" output="command timed out" Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.738322 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/openstack-operator-index-xlq6v" Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.738940 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-xlq6v" podUID="3a862b07-6296-43d8-8aff-2a6fdf1bd898" containerName="registry-server" probeResult="failure" 
output="command timed out" Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.739007 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-xlq6v" Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.739184 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"d293b1f8cf8295731773ed4b75572b75d30b82c2b66a5eb2dff2c0744859165e"} pod="openstack-operators/openstack-operator-index-xlq6v" containerMessage="Container registry-server failed liveness probe, will be restarted" Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.739268 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-xlq6v" podUID="3a862b07-6296-43d8-8aff-2a6fdf1bd898" containerName="registry-server" containerID="cri-o://d293b1f8cf8295731773ed4b75572b75d30b82c2b66a5eb2dff2c0744859165e" gracePeriod=30 Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.833072 4740 trace.go:236] Trace[756907285]: "Calculate volume metrics of storage for pod openshift-logging/logging-loki-compactor-0" (05-Jan-2026 15:10:34.029) (total time: 9802ms): Jan 05 15:10:43 crc kubenswrapper[4740]: Trace[756907285]: [9.802645854s] [9.802645854s] END Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.839568 4740 trace.go:236] Trace[1955333951]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-server-1" (05-Jan-2026 15:10:39.494) (total time: 4345ms): Jan 05 15:10:43 crc kubenswrapper[4740]: Trace[1955333951]: [4.34537273s] [4.34537273s] END Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.939347 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8vc2p container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.939408 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" podUID="1e34aa71-7c05-4606-a71a-2c5b20667ba1" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.54:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.945479 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8vc2p container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.945530 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" podUID="1e34aa71-7c05-4606-a71a-2c5b20667ba1" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.54:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.956234 4740 patch_prober.go:28] interesting pod/console-78554f9b97-scjhq container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.145:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers)" start-of-body= Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.956332 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-78554f9b97-scjhq" podUID="322864a4-da88-47f5-9d44-89a38dd2d8f3" containerName="console" probeResult="failure" output="Get \"https://10.217.0.145:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:43 crc kubenswrapper[4740]: I0105 15:10:43.956439 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-78554f9b97-scjhq" Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.250643 4740 generic.go:334] "Generic (PLEG): container finished" podID="237e57c0-4e0a-4c9f-89a2-e5a84fb41d22" containerID="a81121b4cafa888982fe80a89360b5a2ef5f338e426058529b1b00dfba5ed46d" exitCode=0 Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.250852 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" event={"ID":"237e57c0-4e0a-4c9f-89a2-e5a84fb41d22","Type":"ContainerDied","Data":"a81121b4cafa888982fe80a89360b5a2ef5f338e426058529b1b00dfba5ed46d"} Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.253449 4740 generic.go:334] "Generic (PLEG): container finished" podID="afd48a06-614a-4fed-8629-a1a2eb83ab80" containerID="49f06f29e33c5e348a2da149a499c53a1185432ff82fc3000fe402e4826d6fb3" exitCode=0 Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.253515 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" event={"ID":"afd48a06-614a-4fed-8629-a1a2eb83ab80","Type":"ContainerDied","Data":"49f06f29e33c5e348a2da149a499c53a1185432ff82fc3000fe402e4826d6fb3"} Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.256392 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-gwgzt_b8f9d516-9c35-42b4-a4d5-e6d189053b5e/console-operator/0.log" Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.256463 4740 generic.go:334] "Generic (PLEG): container finished" podID="b8f9d516-9c35-42b4-a4d5-e6d189053b5e" containerID="1cb8ce148fa643ebf57fc324bdbfab16b67bc36f03b57a86e7eb031a2cfc9617" exitCode=1 Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.256559 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" event={"ID":"b8f9d516-9c35-42b4-a4d5-e6d189053b5e","Type":"ContainerDied","Data":"1cb8ce148fa643ebf57fc324bdbfab16b67bc36f03b57a86e7eb031a2cfc9617"} Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.256627 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.378812 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-ffh7k" Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.467574 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.617741 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" podUID="237e57c0-4e0a-4c9f-89a2-e5a84fb41d22" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.44:6080/healthz\": dial tcp 10.217.0.44:6080: connect: 
connection refused" Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.634952 4740 patch_prober.go:28] interesting pod/oauth-openshift-68b95d957b-fwtrs container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.635034 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" podUID="4e0355b5-87f3-4eeb-b13f-8287d9cb0786" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.634975 4740 patch_prober.go:28] interesting pod/oauth-openshift-68b95d957b-fwtrs container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.635111 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-68b95d957b-fwtrs" podUID="4e0355b5-87f3-4eeb-b13f-8287d9cb0786" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.734258 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="af28d147-ae7a-4827-8d93-34cc6e211e3b" containerName="prometheus" probeResult="failure" output="command timed out" Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.734346 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="af28d147-ae7a-4827-8d93-34cc6e211e3b" containerName="prometheus" probeResult="failure" output="command timed out" Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.737595 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-g9bwh" podUID="ceb39900-d5f8-4d29-b3ec-01a60b2e4378" containerName="registry-server" probeResult="failure" output="command timed out" Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.740501 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-g9bwh" podUID="ceb39900-d5f8-4d29-b3ec-01a60b2e4378" containerName="registry-server" probeResult="failure" output="command timed out" Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.752988 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-mkqxp" podUID="dc07eab5-3d3e-4da1-aff1-dc180039a90a" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:44 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:44 crc kubenswrapper[4740]: > Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.770391 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-mkqxp" podUID="dc07eab5-3d3e-4da1-aff1-dc180039a90a" containerName="registry-server" 
probeResult="failure" output=< Jan 05 15:10:44 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:44 crc kubenswrapper[4740]: > Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.983303 4740 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-46x82 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.983631 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82" podUID="89ec45a7-04a1-4913-b5e0-9ebc1d04f46c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.983319 4740 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-46x82 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.983692 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-46x82" podUID="89ec45a7-04a1-4913-b5e0-9ebc1d04f46c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.983398 4740 patch_prober.go:28] interesting pod/console-78554f9b97-scjhq container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.145:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:44 crc kubenswrapper[4740]: I0105 15:10:44.983729 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-78554f9b97-scjhq" podUID="322864a4-da88-47f5-9d44-89a38dd2d8f3" containerName="console" probeResult="failure" output="Get \"https://10.217.0.145:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.217252 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.217333 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.217355 4740 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qhb46 container/olm-operator 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.217400 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" podUID="8255876c-990a-4658-8d74-66b4d45e379c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.217425 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.217441 4740 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qhb46 container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.217462 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qhb46" podUID="8255876c-990a-4658-8d74-66b4d45e379c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.217503 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.217568 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.217641 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.218531 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"14d1a0dd9b47a01c1c3209c3aef94dfbcb5c477c7bd10a123ffa701476b8da1d"} pod="openshift-ingress/router-default-5444994796-f8vfq" containerMessage="Container router failed liveness probe, will be restarted" Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.218571 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" containerID="cri-o://14d1a0dd9b47a01c1c3209c3aef94dfbcb5c477c7bd10a123ffa701476b8da1d" gracePeriod=10 Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.228679 4740 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/redhat-marketplace-8q5ns" podUID="4e38ea3c-6147-4049-a885-c9a247a5697c" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:45 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:45 crc kubenswrapper[4740]: > Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.231159 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-8q5ns" podUID="4e38ea3c-6147-4049-a885-c9a247a5697c" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:45 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:45 crc kubenswrapper[4740]: > Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.239857 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-dgjkq" podUID="02c01131-569c-43e4-b848-8d4b49a383d4" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:45 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:45 crc kubenswrapper[4740]: > Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.245545 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-dgjkq" podUID="02c01131-569c-43e4-b848-8d4b49a383d4" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:45 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:45 crc kubenswrapper[4740]: > Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.245642 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="6d1b6a83-0692-4de9-8d5f-56f4371b9d22" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.11:8080/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.249327 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="6d1b6a83-0692-4de9-8d5f-56f4371b9d22" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.11:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.249373 4740 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-tzhlt container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.249419 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" podUID="b43916b9-e413-4a80-880a-3feee7227ec5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.249477 4740 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-tzhlt container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.249491 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzhlt" podUID="b43916b9-e413-4a80-880a-3feee7227ec5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.273607 4740 generic.go:334] "Generic (PLEG): container finished" podID="2b922e17-0ca9-49cd-8af7-b78776b990bb" containerID="81e8ba9d6865673f9d753a4677be70994eece7af3846dfc661ceebd4675c4a19" exitCode=0 Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.273681 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b922e17-0ca9-49cd-8af7-b78776b990bb","Type":"ContainerDied","Data":"81e8ba9d6865673f9d753a4677be70994eece7af3846dfc661ceebd4675c4a19"} Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.276591 4740 generic.go:334] "Generic (PLEG): container finished" podID="1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8" containerID="803251059ac78863fda7261eae0f16876d61fed9009de0e860465cb2a83226e6" exitCode=0 Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.276653 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" event={"ID":"1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8","Type":"ContainerDied","Data":"803251059ac78863fda7261eae0f16876d61fed9009de0e860465cb2a83226e6"} Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.279751 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" event={"ID":"fb20baf5-f4d5-4234-9552-f4d73c447fcc","Type":"ContainerDied","Data":"a53d6d483e52d8db7e3b594ef4b0aeebecc626bd228672ccefde05069df831ca"} Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.279696 4740 generic.go:334] "Generic (PLEG): container finished" podID="fb20baf5-f4d5-4234-9552-f4d73c447fcc" containerID="a53d6d483e52d8db7e3b594ef4b0aeebecc626bd228672ccefde05069df831ca" exitCode=0 Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.280040 4740 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xvzzm container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.280091 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" podUID="eb24541e-e924-4085-b91e-e2d5a0bc8349" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.280130 4740 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xvzzm container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": context deadline exceeded" start-of-body= Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.280145 4740 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xvzzm" podUID="eb24541e-e924-4085-b91e-e2d5a0bc8349" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": context deadline exceeded" Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.282709 4740 generic.go:334] "Generic (PLEG): container finished" podID="f061b27b-1e1c-42c2-9230-8f82deda6325" containerID="642b6cdffc94c05cafdd62614d483b3daf241708b521d3a8f422c714c83ea1c7" exitCode=0 Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.283202 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" event={"ID":"f061b27b-1e1c-42c2-9230-8f82deda6325","Type":"ContainerDied","Data":"642b6cdffc94c05cafdd62614d483b3daf241708b521d3a8f422c714c83ea1c7"} Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.296515 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-dwcfs" podUID="b023e432-3b4e-4161-bfcc-b5d8b601e9d5" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:45 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:45 crc kubenswrapper[4740]: > Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.305896 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-dwcfs" podUID="b023e432-3b4e-4161-bfcc-b5d8b601e9d5" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:45 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:45 crc kubenswrapper[4740]: > Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.421365 4740 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-ffh7k" podUID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.491783 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-tj7tn" podUID="4df26838-83be-4000-b37e-841a0457717b" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:45 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:45 crc kubenswrapper[4740]: > Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.503305 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-tj7tn" podUID="4df26838-83be-4000-b37e-841a0457717b" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:45 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:45 crc kubenswrapper[4740]: > Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.830652 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]backend-http ok Jan 05 15:10:45 crc kubenswrapper[4740]: [+]has-synced ok Jan 05 15:10:45 crc kubenswrapper[4740]: [-]process-running failed: reason withheld Jan 05 15:10:45 crc kubenswrapper[4740]: healthz check failed Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.830737 4740 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.839860 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-bgrks" podUID="6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:45 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:45 crc kubenswrapper[4740]: > Jan 05 15:10:45 crc kubenswrapper[4740]: I0105 15:10:45.840000 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-bgrks" podUID="6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7" containerName="registry-server" probeResult="failure" output=< Jan 05 15:10:45 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:10:45 crc kubenswrapper[4740]: > Jan 05 15:10:46 crc kubenswrapper[4740]: I0105 15:10:46.010875 4740 patch_prober.go:28] interesting pod/route-controller-manager-bf4f87789-2p768 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": dial tcp 10.217.0.69:8443: connect: connection refused" start-of-body= Jan 05 15:10:46 crc kubenswrapper[4740]: I0105 15:10:46.011224 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" podUID="f4272a46-6424-418e-baf7-dba25f1813c4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": dial tcp 10.217.0.69:8443: connect: connection refused" Jan 05 15:10:46 crc kubenswrapper[4740]: I0105 15:10:46.069573 4740 patch_prober.go:28] interesting pod/monitoring-plugin-6fcb8d88f7-lcqs4 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.89:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:46 crc kubenswrapper[4740]: I0105 15:10:46.069626 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4" podUID="c85eb61e-dde6-42e7-b3d9-82837d0104d7" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.89:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:46 crc kubenswrapper[4740]: I0105 15:10:46.069693 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4" Jan 05 15:10:46 crc kubenswrapper[4740]: I0105 15:10:46.296053 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-gwgzt_b8f9d516-9c35-42b4-a4d5-e6d189053b5e/console-operator/0.log" Jan 05 15:10:46 crc kubenswrapper[4740]: I0105 15:10:46.296517 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" event={"ID":"b8f9d516-9c35-42b4-a4d5-e6d189053b5e","Type":"ContainerStarted","Data":"03b158a4f24ced35e67f66186cabf785856a7622c59579b38bb2ad5d1e2fdad2"} Jan 05 15:10:46 crc kubenswrapper[4740]: I0105 15:10:46.297411 4740 patch_prober.go:28] interesting 
pod/console-operator-58897d9998-gwgzt container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 05 15:10:46 crc kubenswrapper[4740]: I0105 15:10:46.297475 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" podUID="b8f9d516-9c35-42b4-a4d5-e6d189053b5e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 05 15:10:46 crc kubenswrapper[4740]: I0105 15:10:46.821824 4740 patch_prober.go:28] interesting pod/controller-manager-6cdbbbb5bc-rbd47 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": dial tcp 10.217.0.81:8443: connect: connection refused" start-of-body= Jan 05 15:10:46 crc kubenswrapper[4740]: I0105 15:10:46.822165 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" podUID="f061b27b-1e1c-42c2-9230-8f82deda6325" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": dial tcp 10.217.0.81:8443: connect: connection refused" Jan 05 15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.071334 4740 patch_prober.go:28] interesting pod/monitoring-plugin-6fcb8d88f7-lcqs4 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.89:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.072950 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4" podUID="c85eb61e-dde6-42e7-b3d9-82837d0104d7" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.89:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.307497 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mn8lv" event={"ID":"fb20baf5-f4d5-4234-9552-f4d73c447fcc","Type":"ContainerStarted","Data":"d5ae8c9799e80d32257a2b522290ed326cc74a3bb2a42d696f29e9e2f7b00868"} Jan 05 15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.310526 4740 generic.go:334] "Generic (PLEG): container finished" podID="f4272a46-6424-418e-baf7-dba25f1813c4" containerID="fe10947fde16ca0dc4ddddad37e7043c438752dd275cb3f058bb1d9132ca7eaf" exitCode=0 Jan 05 15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.311655 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" event={"ID":"f4272a46-6424-418e-baf7-dba25f1813c4","Type":"ContainerDied","Data":"fe10947fde16ca0dc4ddddad37e7043c438752dd275cb3f058bb1d9132ca7eaf"} Jan 05 15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.311769 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.311716 4740 patch_prober.go:28] interesting pod/console-operator-58897d9998-gwgzt 
container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 05 15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.311907 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" podUID="b8f9d516-9c35-42b4-a4d5-e6d189053b5e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 05 15:10:47 crc kubenswrapper[4740]: E0105 15:10:47.664174 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d293b1f8cf8295731773ed4b75572b75d30b82c2b66a5eb2dff2c0744859165e" cmd=["grpc_health_probe","-addr=:50051"] Jan 05 15:10:47 crc kubenswrapper[4740]: E0105 15:10:47.676559 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d293b1f8cf8295731773ed4b75572b75d30b82c2b66a5eb2dff2c0744859165e" cmd=["grpc_health_probe","-addr=:50051"] Jan 05 15:10:47 crc kubenswrapper[4740]: E0105 15:10:47.678174 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d293b1f8cf8295731773ed4b75572b75d30b82c2b66a5eb2dff2c0744859165e" cmd=["grpc_health_probe","-addr=:50051"] Jan 05 15:10:47 crc kubenswrapper[4740]: E0105 15:10:47.678202 4740 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-operators/openstack-operator-index-xlq6v" podUID="3a862b07-6296-43d8-8aff-2a6fdf1bd898" containerName="registry-server" Jan 05 15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.690593 4740 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.690625 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.722426 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" podUID="d8ffad98-ed22-4c4c-b0b8-234c3358089e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.722755 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" Jan 05 
15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.737513 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7b549fc966-7jlfq" Jan 05 15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.822340 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-lz62l" podUID="e2dc84c3-c204-4f17-bcf3-418ab17b873d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.918234 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" podUID="3868391b-95fe-40be-a77d-593ea72fd786" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.918328 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" Jan 05 15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.918706 4740 patch_prober.go:28] interesting pod/logging-loki-distributor-5f678c8dd6-bdcsh container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:47 crc kubenswrapper[4740]: I0105 15:10:47.918750 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" podUID="7927742d-54ad-4fbb-841a-71d40648d88e" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.124256 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" podUID="cb54fddc-8710-4066-908b-bb7a00a15c7e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.124257 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" podUID="d4245835-8bf3-4491-9e66-a456d2fea83d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.124579 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.124697 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.168404 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-bn857" 
podUID="ca259c15-4c6d-4142-b257-12e805385d3f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.168495 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-bn857" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.209252 4740 patch_prober.go:28] interesting pod/logging-loki-querier-76788598db-n78fw container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.209312 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-vqb8k" podUID="b73e8f21-70b9-4f4a-b96b-f255e80db992" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.209316 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-76788598db-n78fw" podUID="5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.250268 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-szhmd" podUID="35577830-7016-49ad-bae0-8a9962a2e82c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.291197 4740 patch_prober.go:28] interesting pod/logging-loki-query-frontend-69d9546745-lbmjc container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.291247 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" podUID="d8fd9857-fca2-4041-9c72-3747c84b6987" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.291294 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" podUID="9d82f35e-307a-4ed2-89e0-0649e5300e41" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.291344 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" Jan 05 
15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.339603 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-8tvrz" podUID="da2bbca8-6d81-4fd9-b2a6-e52f98b7fb29" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.344475 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-568985c78-7sdkl" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.348104 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" event={"ID":"1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8","Type":"ContainerStarted","Data":"b6e7b237b01fc39ed2ff960d59c2f6927d1481f4179320e9b63a39d4c2aa7a4c"} Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.349760 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.350445 4740 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-n6tdc container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body= Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.350532 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" podUID="1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.354671 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" event={"ID":"237e57c0-4e0a-4c9f-89a2-e5a84fb41d22","Type":"ContainerStarted","Data":"8ea077da0789680bfcac3b6743b39d697d0b23a7791dfe50d810c5961aab6f97"} Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.354801 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.358625 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" event={"ID":"afd48a06-614a-4fed-8629-a1a2eb83ab80","Type":"ContainerStarted","Data":"bb4602a08d8ee5ee9d674b4e21567b40fda082f6c54479518be51edf6b1e6366"} Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.359770 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.400243 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-vvlkq" podUID="d1228a5b-52ed-4d7e-940e-b4b03288fae5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.400975 4740 patch_prober.go:28] 
interesting pod/console-operator-58897d9998-gwgzt container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.401027 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" podUID="b8f9d516-9c35-42b4-a4d5-e6d189053b5e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.458320 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-7dc6b6df78-rkxtw" podUID="c7f964a1-5e19-4cb6-8e25-26fdc09410af" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.459170 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.501323 4740 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lfg5l container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.5:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.501374 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" podUID="45f52c16-f526-4498-bc85-2aec3b292a60" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.5:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.501419 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.502512 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="operator" containerStatusID={"Type":"cri-o","ID":"8dcb6b31199d5e17c148ecc9fd84477e0e7abb7063ee9d6f58e3390f9189f735"} pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" containerMessage="Container operator failed liveness probe, will be restarted" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.502557 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" podUID="45f52c16-f526-4498-bc85-2aec3b292a60" containerName="operator" containerID="cri-o://8dcb6b31199d5e17c148ecc9fd84477e0e7abb7063ee9d6f58e3390f9189f735" gracePeriod=30 Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.585404 4740 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lfg5l container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.5:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.585398 4740 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-tfjlm" podUID="b8ae6035-6986-4e15-ac19-6e093c0a9e7a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.128:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.585502 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-f6f74d6db-jqccs" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.585930 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" podUID="45f52c16-f526-4498-bc85-2aec3b292a60" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.5:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.586349 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.594289 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8svgw container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.594348 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" podUID="fd41c01f-dff3-4b6a-ae38-8b114b384a59" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.723271 4740 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-lpjfp container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.12:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.723336 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" podUID="b13bfd05-5a88-449b-9d26-f11acf9c6bbf" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.12:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.723408 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.739601 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-bn857" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.880345 4740 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.880409 4740 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.910324 4740 patch_prober.go:28] interesting pod/logging-loki-distributor-5f678c8dd6-bdcsh container/loki-distributor namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.50:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:48 crc kubenswrapper[4740]: I0105 15:10:48.910393 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" podUID="7927742d-54ad-4fbb-841a-71d40648d88e" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.50:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.100183 4740 patch_prober.go:28] interesting pod/logging-loki-querier-76788598db-n78fw container/loki-querier namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.51:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.100476 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-querier-76788598db-n78fw" podUID="5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.51:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.130915 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-2hj8b" Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.134180 4740 patch_prober.go:28] interesting pod/logging-loki-query-frontend-69d9546745-lbmjc container/loki-query-frontend namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.134260 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" podUID="d8fd9857-fca2-4041-9c72-3747c84b6987" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.384986 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" event={"ID":"f4272a46-6424-418e-baf7-dba25f1813c4","Type":"ContainerStarted","Data":"32ef1c9cc439accea12cf65d465902e38e476c0e470a8ee21b8bd9bae0ef1234"} Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.386391 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 
15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.386531 4740 patch_prober.go:28] interesting pod/route-controller-manager-bf4f87789-2p768 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": dial tcp 10.217.0.69:8443: connect: connection refused" start-of-body= Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.386569 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" podUID="f4272a46-6424-418e-baf7-dba25f1813c4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": dial tcp 10.217.0.69:8443: connect: connection refused" Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.392603 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" event={"ID":"f061b27b-1e1c-42c2-9230-8f82deda6325","Type":"ContainerStarted","Data":"ed14fe5abbebc2a7fbd2f16b9a2e1e838e469e4eb4c5e9e4cbab96428148f34b"} Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.392950 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.394135 4740 patch_prober.go:28] interesting pod/controller-manager-6cdbbbb5bc-rbd47 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": dial tcp 10.217.0.81:8443: connect: connection refused" start-of-body= Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.394182 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" podUID="f061b27b-1e1c-42c2-9230-8f82deda6325" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": dial tcp 10.217.0.81:8443: connect: connection refused" Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.396081 4740 generic.go:334] "Generic (PLEG): container finished" podID="3a862b07-6296-43d8-8aff-2a6fdf1bd898" containerID="d293b1f8cf8295731773ed4b75572b75d30b82c2b66a5eb2dff2c0744859165e" exitCode=0 Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.396119 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xlq6v" event={"ID":"3a862b07-6296-43d8-8aff-2a6fdf1bd898","Type":"ContainerDied","Data":"d293b1f8cf8295731773ed4b75572b75d30b82c2b66a5eb2dff2c0744859165e"} Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.399829 4740 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-n6tdc container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body= Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.399879 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" podUID="1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.553208 
4740 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-n6tdc container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body= Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.553257 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" podUID="1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.553277 4740 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-n6tdc container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body= Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.553324 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" podUID="1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.734078 4740 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-lpjfp container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.12:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.734141 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" podUID="b13bfd05-5a88-449b-9d26-f11acf9c6bbf" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.12:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.735894 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="dbb5263b-e98b-48a4-825e-ffb99738059f" containerName="galera" probeResult="failure" output="command timed out" Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.874251 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" podUID="6faf05ee-49e0-4d3e-afcd-d11d9494da44" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.874307 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" podUID="6faf05ee-49e0-4d3e-afcd-d11d9494da44" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.874667 4740 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.951659 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" podUID="3f0a6bbe-32b3-4e4d-afef-32e871616c6d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:49 crc kubenswrapper[4740]: I0105 15:10:49.951734 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-6d99759cf-mglsd" podUID="3f0a6bbe-32b3-4e4d-afef-32e871616c6d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.151525 4740 patch_prober.go:28] interesting pod/loki-operator-controller-manager-56d45b676b-q44gh container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.48:8081/readyz\": read tcp 10.217.0.2:55882->10.217.0.48:8081: read: connection reset by peer" start-of-body= Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.151601 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" podUID="50147f9c-3a52-4e0e-b0cc-1fd94e7def10" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.48:8081/readyz\": read tcp 10.217.0.2:55882->10.217.0.48:8081: read: connection reset by peer" Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.411720 4740 generic.go:334] "Generic (PLEG): container finished" podID="45f52c16-f526-4498-bc85-2aec3b292a60" containerID="8dcb6b31199d5e17c148ecc9fd84477e0e7abb7063ee9d6f58e3390f9189f735" exitCode=0 Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.411790 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" event={"ID":"45f52c16-f526-4498-bc85-2aec3b292a60","Type":"ContainerDied","Data":"8dcb6b31199d5e17c148ecc9fd84477e0e7abb7063ee9d6f58e3390f9189f735"} Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.414248 4740 generic.go:334] "Generic (PLEG): container finished" podID="cb54fddc-8710-4066-908b-bb7a00a15c7e" containerID="46af320ca584a7bd1ea5544687ffc6b2fa65c91fe544c8fefd1f91910415e9db" exitCode=1 Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.414319 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" event={"ID":"cb54fddc-8710-4066-908b-bb7a00a15c7e","Type":"ContainerDied","Data":"46af320ca584a7bd1ea5544687ffc6b2fa65c91fe544c8fefd1f91910415e9db"} Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.417036 4740 scope.go:117] "RemoveContainer" containerID="46af320ca584a7bd1ea5544687ffc6b2fa65c91fe544c8fefd1f91910415e9db" Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.444329 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b922e17-0ca9-49cd-8af7-b78776b990bb","Type":"ContainerStarted","Data":"7ebe37a0aaf591999a2fe1e749f28f793eb6995877d20de18b4f7799948ca1e0"} Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.457667 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-xlq6v" event={"ID":"3a862b07-6296-43d8-8aff-2a6fdf1bd898","Type":"ContainerStarted","Data":"893f96ceb20ea0e2ec359afcb4ce9c78e793334c2cf998b020b0121051b09396"} Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.502540 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" podUID="49bbef73-8653-4747-93ee-35819a394b1f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.129:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.502586 4740 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-ffh7k" podUID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.503050 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-779f597f97-v7z84" podUID="49bbef73-8653-4747-93ee-35819a394b1f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.129:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.503222 4740 patch_prober.go:28] interesting pod/controller-manager-6cdbbbb5bc-rbd47 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": dial tcp 10.217.0.81:8443: connect: connection refused" start-of-body= Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.503248 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" podUID="f061b27b-1e1c-42c2-9230-8f82deda6325" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": dial tcp 10.217.0.81:8443: connect: connection refused" Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.503275 4740 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-n6tdc container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body= Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.503336 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" podUID="1cb9cbf8-eb98-4ea1-ad3e-0be2ba14c1e8" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.504166 4740 patch_prober.go:28] interesting pod/route-controller-manager-bf4f87789-2p768 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": dial tcp 10.217.0.69:8443: connect: connection refused" start-of-body= Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.504197 4740 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" podUID="f4272a46-6424-418e-baf7-dba25f1813c4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": dial tcp 10.217.0.69:8443: connect: connection refused" Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.550583 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-78948ddfd7m88wc" Jan 05 15:10:50 crc kubenswrapper[4740]: I0105 15:10:50.735322 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="261c319a-37da-4987-a774-ecc24fa6b083" containerName="galera" probeResult="failure" output="command timed out" Jan 05 15:10:51 crc kubenswrapper[4740]: I0105 15:10:51.369237 4740 patch_prober.go:28] interesting pod/console-operator-58897d9998-gwgzt container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 05 15:10:51 crc kubenswrapper[4740]: I0105 15:10:51.369272 4740 patch_prober.go:28] interesting pod/console-operator-58897d9998-gwgzt container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 05 15:10:51 crc kubenswrapper[4740]: I0105 15:10:51.369301 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" podUID="b8f9d516-9c35-42b4-a4d5-e6d189053b5e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 05 15:10:51 crc kubenswrapper[4740]: I0105 15:10:51.369326 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" podUID="b8f9d516-9c35-42b4-a4d5-e6d189053b5e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 05 15:10:51 crc kubenswrapper[4740]: I0105 15:10:51.480667 4740 generic.go:334] "Generic (PLEG): container finished" podID="50147f9c-3a52-4e0e-b0cc-1fd94e7def10" containerID="ede59bfc0d35a45fc093143ecbb937450674fb8acdfe859a924be5f23761f648" exitCode=1 Jan 05 15:10:51 crc kubenswrapper[4740]: I0105 15:10:51.480933 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" event={"ID":"50147f9c-3a52-4e0e-b0cc-1fd94e7def10","Type":"ContainerDied","Data":"ede59bfc0d35a45fc093143ecbb937450674fb8acdfe859a924be5f23761f648"} Jan 05 15:10:51 crc kubenswrapper[4740]: I0105 15:10:51.481668 4740 scope.go:117] "RemoveContainer" containerID="ede59bfc0d35a45fc093143ecbb937450674fb8acdfe859a924be5f23761f648" Jan 05 15:10:51 crc kubenswrapper[4740]: I0105 15:10:51.488815 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" event={"ID":"cb54fddc-8710-4066-908b-bb7a00a15c7e","Type":"ContainerStarted","Data":"1f7e86435a98bd7cdb60b3de590a26cd74be7ba812a57d99b9866b8e6bea00f1"} Jan 05 15:10:51 crc kubenswrapper[4740]: I0105 15:10:51.489991 4740 status_manager.go:317] "Container readiness changed for 
unknown container" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" containerID="cri-o://46af320ca584a7bd1ea5544687ffc6b2fa65c91fe544c8fefd1f91910415e9db" Jan 05 15:10:51 crc kubenswrapper[4740]: I0105 15:10:51.490043 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" Jan 05 15:10:51 crc kubenswrapper[4740]: I0105 15:10:51.833525 4740 trace.go:236] Trace[1803930261]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (05-Jan-2026 15:10:48.436) (total time: 3396ms): Jan 05 15:10:51 crc kubenswrapper[4740]: Trace[1803930261]: [3.396235787s] [3.396235787s] END Jan 05 15:10:51 crc kubenswrapper[4740]: I0105 15:10:51.833524 4740 trace.go:236] Trace[1871331210]: "Calculate volume metrics of swift for pod openstack/swift-storage-0" (05-Jan-2026 15:10:47.356) (total time: 4476ms): Jan 05 15:10:51 crc kubenswrapper[4740]: Trace[1871331210]: [4.476042485s] [4.476042485s] END Jan 05 15:10:52 crc kubenswrapper[4740]: I0105 15:10:52.511202 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" event={"ID":"45f52c16-f526-4498-bc85-2aec3b292a60","Type":"ContainerStarted","Data":"caf221e4e625e7e59678067395ebd50f9d26f9d6c3ef237f72d27e48456f8ad6"} Jan 05 15:10:52 crc kubenswrapper[4740]: I0105 15:10:52.512813 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 15:10:52 crc kubenswrapper[4740]: I0105 15:10:52.513233 4740 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lfg5l container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.5:8081/healthz\": dial tcp 10.217.0.5:8081: connect: connection refused" start-of-body= Jan 05 15:10:52 crc kubenswrapper[4740]: I0105 15:10:52.513266 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" podUID="45f52c16-f526-4498-bc85-2aec3b292a60" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.5:8081/healthz\": dial tcp 10.217.0.5:8081: connect: connection refused" Jan 05 15:10:52 crc kubenswrapper[4740]: I0105 15:10:52.518208 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" event={"ID":"50147f9c-3a52-4e0e-b0cc-1fd94e7def10","Type":"ContainerStarted","Data":"1b508ddaa93016646833d5532d5911130d989c13061b9bad423bba8e32c8fdae"} Jan 05 15:10:52 crc kubenswrapper[4740]: I0105 15:10:52.518585 4740 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" containerID="cri-o://ede59bfc0d35a45fc093143ecbb937450674fb8acdfe859a924be5f23761f648" Jan 05 15:10:52 crc kubenswrapper[4740]: I0105 15:10:52.518630 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 15:10:52 crc kubenswrapper[4740]: I0105 15:10:52.518659 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" Jan 05 15:10:52 crc kubenswrapper[4740]: I0105 15:10:52.865010 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 05 
15:10:53 crc kubenswrapper[4740]: I0105 15:10:53.527388 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 15:10:53 crc kubenswrapper[4740]: I0105 15:10:53.532527 4740 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lfg5l container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.5:8081/healthz\": dial tcp 10.217.0.5:8081: connect: connection refused" start-of-body= Jan 05 15:10:53 crc kubenswrapper[4740]: I0105 15:10:53.532585 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" podUID="45f52c16-f526-4498-bc85-2aec3b292a60" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.5:8081/healthz\": dial tcp 10.217.0.5:8081: connect: connection refused" Jan 05 15:10:53 crc kubenswrapper[4740]: I0105 15:10:53.997262 4740 patch_prober.go:28] interesting pod/console-78554f9b97-scjhq container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.145:8443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:53 crc kubenswrapper[4740]: I0105 15:10:53.997601 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-78554f9b97-scjhq" podUID="322864a4-da88-47f5-9d44-89a38dd2d8f3" containerName="console" probeResult="failure" output="Get \"https://10.217.0.145:8443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:54 crc kubenswrapper[4740]: I0105 15:10:54.542350 4740 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lfg5l container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.5:8081/healthz\": dial tcp 10.217.0.5:8081: connect: connection refused" start-of-body= Jan 05 15:10:54 crc kubenswrapper[4740]: I0105 15:10:54.542402 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" podUID="45f52c16-f526-4498-bc85-2aec3b292a60" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.5:8081/healthz\": dial tcp 10.217.0.5:8081: connect: connection refused" Jan 05 15:10:54 crc kubenswrapper[4740]: I0105 15:10:54.617021 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" podUID="237e57c0-4e0a-4c9f-89a2-e5a84fb41d22" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.44:6080/healthz\": dial tcp 10.217.0.44:6080: connect: connection refused" Jan 05 15:10:55 crc kubenswrapper[4740]: I0105 15:10:55.127145 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]backend-http ok Jan 05 15:10:55 crc kubenswrapper[4740]: [+]has-synced ok Jan 05 15:10:55 crc kubenswrapper[4740]: [-]process-running failed: reason withheld Jan 05 15:10:55 crc kubenswrapper[4740]: healthz check failed Jan 05 15:10:55 crc kubenswrapper[4740]: I0105 15:10:55.127442 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Jan 05 15:10:55 crc kubenswrapper[4740]: I0105 15:10:55.357376 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="261c319a-37da-4987-a774-ecc24fa6b083" containerName="galera" containerID="cri-o://7bb6f8c6db25449b16a74dc5908883d2bd37222f79207529ff4c13d3419dd47e" gracePeriod=17 Jan 05 15:10:55 crc kubenswrapper[4740]: I0105 15:10:55.368397 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="dbb5263b-e98b-48a4-825e-ffb99738059f" containerName="galera" containerID="cri-o://835db857eed7aafbfc05fddb2e1acc2bfca45dd9afefccb55f578e40d71c0888" gracePeriod=15 Jan 05 15:10:55 crc kubenswrapper[4740]: I0105 15:10:55.420195 4740 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-ffh7k" podUID="3c26b9da-cc4e-44dc-92ba-92e42b962010" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:55 crc kubenswrapper[4740]: I0105 15:10:55.555814 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5444994796-f8vfq_61ccfc6e-26f3-473d-9a65-40ca39aafca2/router/0.log" Jan 05 15:10:55 crc kubenswrapper[4740]: I0105 15:10:55.555972 4740 generic.go:334] "Generic (PLEG): container finished" podID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerID="14d1a0dd9b47a01c1c3209c3aef94dfbcb5c477c7bd10a123ffa701476b8da1d" exitCode=137 Jan 05 15:10:55 crc kubenswrapper[4740]: I0105 15:10:55.556005 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-f8vfq" event={"ID":"61ccfc6e-26f3-473d-9a65-40ca39aafca2","Type":"ContainerDied","Data":"14d1a0dd9b47a01c1c3209c3aef94dfbcb5c477c7bd10a123ffa701476b8da1d"} Jan 05 15:10:55 crc kubenswrapper[4740]: I0105 15:10:55.714240 4740 patch_prober.go:28] interesting pod/metrics-server-85559f775f-nmxz8 container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:55 crc kubenswrapper[4740]: I0105 15:10:55.714665 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" podUID="8e7d78e7-6855-46c1-a51c-7f4127c80b7d" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:55 crc kubenswrapper[4740]: I0105 15:10:55.757913 4740 patch_prober.go:28] interesting pod/metrics-server-85559f775f-nmxz8 container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:55 crc kubenswrapper[4740]: I0105 15:10:55.757978 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-85559f775f-nmxz8" podUID="8e7d78e7-6855-46c1-a51c-7f4127c80b7d" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.88:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:55 crc kubenswrapper[4740]: I0105 15:10:55.825668 4740 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-6fcb8d88f7-lcqs4" Jan 05 15:10:56 crc kubenswrapper[4740]: I0105 15:10:56.016631 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-bf4f87789-2p768" Jan 05 15:10:56 crc kubenswrapper[4740]: I0105 15:10:56.570377 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5444994796-f8vfq_61ccfc6e-26f3-473d-9a65-40ca39aafca2/router/0.log" Jan 05 15:10:56 crc kubenswrapper[4740]: I0105 15:10:56.570459 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-f8vfq" event={"ID":"61ccfc6e-26f3-473d-9a65-40ca39aafca2","Type":"ContainerStarted","Data":"fcd3b23dafa62f3a8515fec295269767ecf8fc50e6b72b04df95992065815d48"} Jan 05 15:10:56 crc kubenswrapper[4740]: I0105 15:10:56.822655 4740 patch_prober.go:28] interesting pod/controller-manager-6cdbbbb5bc-rbd47 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.81:8443/healthz\": dial tcp 10.217.0.81:8443: connect: connection refused" start-of-body= Jan 05 15:10:56 crc kubenswrapper[4740]: I0105 15:10:56.823009 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" podUID="f061b27b-1e1c-42c2-9230-8f82deda6325" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.81:8443/healthz\": dial tcp 10.217.0.81:8443: connect: connection refused" Jan 05 15:10:57 crc kubenswrapper[4740]: I0105 15:10:57.039799 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" podUID="cb54fddc-8710-4066-908b-bb7a00a15c7e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": dial tcp 10.217.0.114:8081: connect: connection refused" Jan 05 15:10:57 crc kubenswrapper[4740]: I0105 15:10:57.133757 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 15:10:57 crc kubenswrapper[4740]: I0105 15:10:57.135704 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Jan 05 15:10:57 crc kubenswrapper[4740]: I0105 15:10:57.135757 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Jan 05 15:10:57 crc kubenswrapper[4740]: I0105 15:10:57.533518 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-lfg5l" Jan 05 15:10:57 crc kubenswrapper[4740]: I0105 15:10:57.657297 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-xlq6v" Jan 05 15:10:57 crc kubenswrapper[4740]: I0105 15:10:57.657890 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-xlq6v" Jan 05 15:10:57 crc kubenswrapper[4740]: E0105 
15:10:57.835012 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="835db857eed7aafbfc05fddb2e1acc2bfca45dd9afefccb55f578e40d71c0888" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 05 15:10:57 crc kubenswrapper[4740]: E0105 15:10:57.836698 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="835db857eed7aafbfc05fddb2e1acc2bfca45dd9afefccb55f578e40d71c0888" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 05 15:10:57 crc kubenswrapper[4740]: E0105 15:10:57.839273 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="835db857eed7aafbfc05fddb2e1acc2bfca45dd9afefccb55f578e40d71c0888" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 05 15:10:57 crc kubenswrapper[4740]: E0105 15:10:57.839405 4740 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="dbb5263b-e98b-48a4-825e-ffb99738059f" containerName="galera" Jan 05 15:10:57 crc kubenswrapper[4740]: I0105 15:10:57.909189 4740 patch_prober.go:28] interesting pod/logging-loki-distributor-5f678c8dd6-bdcsh container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:57 crc kubenswrapper[4740]: I0105 15:10:57.909281 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" podUID="7927742d-54ad-4fbb-841a-71d40648d88e" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.50:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:58 crc kubenswrapper[4740]: I0105 15:10:58.104150 4740 patch_prober.go:28] interesting pod/logging-loki-querier-76788598db-n78fw container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:58 crc kubenswrapper[4740]: I0105 15:10:58.104201 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-76788598db-n78fw" podUID="5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.51:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:58 crc kubenswrapper[4740]: I0105 15:10:58.144681 4740 patch_prober.go:28] interesting pod/logging-loki-query-frontend-69d9546745-lbmjc container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:58 crc kubenswrapper[4740]: 
I0105 15:10:58.144734 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" podUID="d8fd9857-fca2-4041-9c72-3747c84b6987" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:58 crc kubenswrapper[4740]: I0105 15:10:58.147300 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Jan 05 15:10:58 crc kubenswrapper[4740]: I0105 15:10:58.147379 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Jan 05 15:10:58 crc kubenswrapper[4740]: I0105 15:10:58.176210 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" podUID="939f9254-d7a5-48cb-8ab1-7ea2e4f68610" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.101:7472/metrics\": dial tcp 10.217.0.101:7472: connect: connection refused" Jan 05 15:10:58 crc kubenswrapper[4740]: I0105 15:10:58.176250 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" podUID="939f9254-d7a5-48cb-8ab1-7ea2e4f68610" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.101:7472/metrics\": dial tcp 10.217.0.101:7472: connect: connection refused" Jan 05 15:10:58 crc kubenswrapper[4740]: I0105 15:10:58.230828 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-lpjfp" Jan 05 15:10:58 crc kubenswrapper[4740]: I0105 15:10:58.335198 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="623b4799-3a7c-4cba-8632-420ef0704992" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 05 15:10:58 crc kubenswrapper[4740]: I0105 15:10:58.595145 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8svgw container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:58 crc kubenswrapper[4740]: I0105 15:10:58.595692 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8svgw container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 05 15:10:58 crc kubenswrapper[4740]: I0105 15:10:58.595764 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" podUID="fd41c01f-dff3-4b6a-ae38-8b114b384a59" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.53:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers)" Jan 05 15:10:58 crc kubenswrapper[4740]: I0105 15:10:58.595697 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8svgw" podUID="fd41c01f-dff3-4b6a-ae38-8b114b384a59" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.53:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 05 15:10:58 crc kubenswrapper[4740]: I0105 15:10:58.733467 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="bf526b0a-998c-4943-bfba-04352421ed58" containerName="ovn-northd" probeResult="failure" output="command timed out" Jan 05 15:10:58 crc kubenswrapper[4740]: I0105 15:10:58.733462 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="bf526b0a-998c-4943-bfba-04352421ed58" containerName="ovn-northd" probeResult="failure" output="command timed out" Jan 05 15:10:58 crc kubenswrapper[4740]: I0105 15:10:58.940180 4740 patch_prober.go:28] interesting pod/logging-loki-gateway-66cd7bf4cd-8vc2p container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:8081/ready\": context deadline exceeded" start-of-body= Jan 05 15:10:58 crc kubenswrapper[4740]: I0105 15:10:58.940461 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-66cd7bf4cd-8vc2p" podUID="1e34aa71-7c05-4606-a71a-2c5b20667ba1" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.54:8081/ready\": context deadline exceeded" Jan 05 15:10:59 crc kubenswrapper[4740]: I0105 15:10:59.134981 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Jan 05 15:10:59 crc kubenswrapper[4740]: I0105 15:10:59.135096 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Jan 05 15:10:59 crc kubenswrapper[4740]: I0105 15:10:59.231098 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-xlq6v" Jan 05 15:10:59 crc kubenswrapper[4740]: I0105 15:10:59.252001 4740 trace.go:236] Trace[1101410893]: "Calculate volume metrics of storage for pod openshift-logging/logging-loki-index-gateway-0" (05-Jan-2026 15:10:55.861) (total time: 3389ms): Jan 05 15:10:59 crc kubenswrapper[4740]: Trace[1101410893]: [3.389365811s] [3.389365811s] END Jan 05 15:10:59 crc kubenswrapper[4740]: I0105 15:10:59.372563 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" podUID="afd48a06-614a-4fed-8629-a1a2eb83ab80" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.102:7572/metrics\": dial tcp 10.217.0.102:7572: connect: connection refused" Jan 05 15:10:59 crc kubenswrapper[4740]: I0105 15:10:59.372743 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" podUID="afd48a06-614a-4fed-8629-a1a2eb83ab80" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get 
\"http://10.217.0.102:7572/metrics\": dial tcp 10.217.0.102:7572: connect: connection refused" Jan 05 15:10:59 crc kubenswrapper[4740]: I0105 15:10:59.388572 4740 patch_prober.go:28] interesting pod/loki-operator-controller-manager-56d45b676b-q44gh container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.48:8081/readyz\": dial tcp 10.217.0.48:8081: connect: connection refused" start-of-body= Jan 05 15:10:59 crc kubenswrapper[4740]: I0105 15:10:59.388629 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" podUID="50147f9c-3a52-4e0e-b0cc-1fd94e7def10" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.48:8081/readyz\": dial tcp 10.217.0.48:8081: connect: connection refused" Jan 05 15:10:59 crc kubenswrapper[4740]: I0105 15:10:59.421515 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="623b4799-3a7c-4cba-8632-420ef0704992" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 05 15:10:59 crc kubenswrapper[4740]: I0105 15:10:59.616685 4740 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" podUID="237e57c0-4e0a-4c9f-89a2-e5a84fb41d22" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.44:6080/healthz\": dial tcp 10.217.0.44:6080: connect: connection refused" Jan 05 15:10:59 crc kubenswrapper[4740]: E0105 15:10:59.677802 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7bb6f8c6db25449b16a74dc5908883d2bd37222f79207529ff4c13d3419dd47e" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 05 15:10:59 crc kubenswrapper[4740]: E0105 15:10:59.681884 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7bb6f8c6db25449b16a74dc5908883d2bd37222f79207529ff4c13d3419dd47e" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 05 15:10:59 crc kubenswrapper[4740]: E0105 15:10:59.683677 4740 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7bb6f8c6db25449b16a74dc5908883d2bd37222f79207529ff4c13d3419dd47e" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 05 15:10:59 crc kubenswrapper[4740]: E0105 15:10:59.683736 4740 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="261c319a-37da-4987-a774-ecc24fa6b083" containerName="galera" Jan 05 15:10:59 crc kubenswrapper[4740]: I0105 15:10:59.783561 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-n6tdc" Jan 05 15:10:59 crc kubenswrapper[4740]: I0105 15:10:59.952709 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-index-xlq6v" Jan 05 15:11:00 crc kubenswrapper[4740]: I0105 15:11:00.230531 4740 patch_prober.go:28] interesting pod/router-default-5444994796-f8vfq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 05 15:11:00 crc kubenswrapper[4740]: [-]has-synced failed: reason withheld Jan 05 15:11:00 crc kubenswrapper[4740]: [+]process-running ok Jan 05 15:11:00 crc kubenswrapper[4740]: healthz check failed Jan 05 15:11:00 crc kubenswrapper[4740]: I0105 15:11:00.230612 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f8vfq" podUID="61ccfc6e-26f3-473d-9a65-40ca39aafca2" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 05 15:11:00 crc kubenswrapper[4740]: I0105 15:11:00.350626 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-ffh7k" Jan 05 15:11:01 crc kubenswrapper[4740]: I0105 15:11:01.152934 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 15:11:01 crc kubenswrapper[4740]: I0105 15:11:01.173239 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 15:11:01 crc kubenswrapper[4740]: I0105 15:11:01.173587 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-f8vfq" Jan 05 15:11:01 crc kubenswrapper[4740]: I0105 15:11:01.428612 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-gwgzt" Jan 05 15:11:01 crc kubenswrapper[4740]: I0105 15:11:01.661289 4740 generic.go:334] "Generic (PLEG): container finished" podID="261c319a-37da-4987-a774-ecc24fa6b083" containerID="7bb6f8c6db25449b16a74dc5908883d2bd37222f79207529ff4c13d3419dd47e" exitCode=0 Jan 05 15:11:01 crc kubenswrapper[4740]: I0105 15:11:01.662757 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"261c319a-37da-4987-a774-ecc24fa6b083","Type":"ContainerDied","Data":"7bb6f8c6db25449b16a74dc5908883d2bd37222f79207529ff4c13d3419dd47e"} Jan 05 15:11:02 crc kubenswrapper[4740]: I0105 15:11:02.451681 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="623b4799-3a7c-4cba-8632-420ef0704992" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 05 15:11:02 crc kubenswrapper[4740]: I0105 15:11:02.452000 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 05 15:11:02 crc kubenswrapper[4740]: I0105 15:11:02.453193 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" containerStatusID={"Type":"cri-o","ID":"5723b4f51c66bf05ad0b480f2a668450cf5919bea8d5559edf38b075fbf0d323"} pod="openstack/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed liveness probe, will be restarted" Jan 05 15:11:02 crc kubenswrapper[4740]: I0105 15:11:02.453248 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="623b4799-3a7c-4cba-8632-420ef0704992" containerName="cinder-scheduler" 
containerID="cri-o://5723b4f51c66bf05ad0b480f2a668450cf5919bea8d5559edf38b075fbf0d323" gracePeriod=30 Jan 05 15:11:02 crc kubenswrapper[4740]: I0105 15:11:02.683157 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"261c319a-37da-4987-a774-ecc24fa6b083","Type":"ContainerStarted","Data":"75a37d71b3f0c9350eeec10019d175444dd8db3c045be8109ee1529fd2c1fb38"} Jan 05 15:11:02 crc kubenswrapper[4740]: I0105 15:11:02.686585 4740 generic.go:334] "Generic (PLEG): container finished" podID="dbb5263b-e98b-48a4-825e-ffb99738059f" containerID="835db857eed7aafbfc05fddb2e1acc2bfca45dd9afefccb55f578e40d71c0888" exitCode=0 Jan 05 15:11:02 crc kubenswrapper[4740]: I0105 15:11:02.686621 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbb5263b-e98b-48a4-825e-ffb99738059f","Type":"ContainerDied","Data":"835db857eed7aafbfc05fddb2e1acc2bfca45dd9afefccb55f578e40d71c0888"} Jan 05 15:11:03 crc kubenswrapper[4740]: I0105 15:11:03.007167 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-78554f9b97-scjhq" Jan 05 15:11:03 crc kubenswrapper[4740]: I0105 15:11:03.703966 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbb5263b-e98b-48a4-825e-ffb99738059f","Type":"ContainerStarted","Data":"d104c2e0251269fb59c983f6895e970b5ccb9c9465f306af8d8f842d4b9d9341"} Jan 05 15:11:04 crc kubenswrapper[4740]: I0105 15:11:04.618735 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-zc946" Jan 05 15:11:06 crc kubenswrapper[4740]: I0105 15:11:06.769152 4740 generic.go:334] "Generic (PLEG): container finished" podID="623b4799-3a7c-4cba-8632-420ef0704992" containerID="5723b4f51c66bf05ad0b480f2a668450cf5919bea8d5559edf38b075fbf0d323" exitCode=0 Jan 05 15:11:06 crc kubenswrapper[4740]: I0105 15:11:06.769862 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"623b4799-3a7c-4cba-8632-420ef0704992","Type":"ContainerDied","Data":"5723b4f51c66bf05ad0b480f2a668450cf5919bea8d5559edf38b075fbf0d323"} Jan 05 15:11:06 crc kubenswrapper[4740]: I0105 15:11:06.825599 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6cdbbbb5bc-rbd47" Jan 05 15:11:06 crc kubenswrapper[4740]: I0105 15:11:06.927155 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-bdcsh" Jan 05 15:11:07 crc kubenswrapper[4740]: I0105 15:11:07.044173 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-f99f54bc8-9s2d8" Jan 05 15:11:07 crc kubenswrapper[4740]: I0105 15:11:07.103449 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-76788598db-n78fw" Jan 05 15:11:07 crc kubenswrapper[4740]: I0105 15:11:07.149044 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-69d9546745-lbmjc" Jan 05 15:11:07 crc kubenswrapper[4740]: I0105 15:11:07.783325 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"623b4799-3a7c-4cba-8632-420ef0704992","Type":"ContainerStarted","Data":"3aadfaf84c7888ec102f514336786b1eb5782a86494ad661eefc7876d0a7568e"} Jan 
05 15:11:07 crc kubenswrapper[4740]: I0105 15:11:07.833304 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 05 15:11:07 crc kubenswrapper[4740]: I0105 15:11:07.833360 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 05 15:11:07 crc kubenswrapper[4740]: I0105 15:11:07.943717 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 05 15:11:08 crc kubenswrapper[4740]: I0105 15:11:08.175512 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-d9475cc54-vwc79" Jan 05 15:11:08 crc kubenswrapper[4740]: I0105 15:11:08.893398 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 05 15:11:09 crc kubenswrapper[4740]: I0105 15:11:09.377161 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qhkx7" Jan 05 15:11:09 crc kubenswrapper[4740]: I0105 15:11:09.417400 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-56d45b676b-q44gh" Jan 05 15:11:09 crc kubenswrapper[4740]: I0105 15:11:09.675251 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 05 15:11:09 crc kubenswrapper[4740]: I0105 15:11:09.675302 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 05 15:11:09 crc kubenswrapper[4740]: I0105 15:11:09.770411 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 05 15:11:09 crc kubenswrapper[4740]: I0105 15:11:09.906866 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 05 15:11:12 crc kubenswrapper[4740]: I0105 15:11:12.405567 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 05 15:11:12 crc kubenswrapper[4740]: I0105 15:11:12.517864 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 05 15:12:09 crc kubenswrapper[4740]: I0105 15:12:09.549442 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5ghq9"] Jan 05 15:12:09 crc kubenswrapper[4740]: E0105 15:12:09.553597 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1767a202-36d4-4ff0-928b-6871274b165a" containerName="extract-content" Jan 05 15:12:09 crc kubenswrapper[4740]: I0105 15:12:09.553987 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1767a202-36d4-4ff0-928b-6871274b165a" containerName="extract-content" Jan 05 15:12:09 crc kubenswrapper[4740]: E0105 15:12:09.554158 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1767a202-36d4-4ff0-928b-6871274b165a" containerName="registry-server" Jan 05 15:12:09 crc kubenswrapper[4740]: I0105 15:12:09.554192 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1767a202-36d4-4ff0-928b-6871274b165a" containerName="registry-server" Jan 05 15:12:09 crc kubenswrapper[4740]: E0105 15:12:09.554265 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1767a202-36d4-4ff0-928b-6871274b165a" containerName="extract-utilities" Jan 05 15:12:09 crc 
kubenswrapper[4740]: I0105 15:12:09.554280 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="1767a202-36d4-4ff0-928b-6871274b165a" containerName="extract-utilities" Jan 05 15:12:09 crc kubenswrapper[4740]: I0105 15:12:09.555460 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="1767a202-36d4-4ff0-928b-6871274b165a" containerName="registry-server" Jan 05 15:12:09 crc kubenswrapper[4740]: I0105 15:12:09.561636 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ghq9" Jan 05 15:12:09 crc kubenswrapper[4740]: I0105 15:12:09.573466 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5ghq9"] Jan 05 15:12:09 crc kubenswrapper[4740]: I0105 15:12:09.678087 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6f794f-d34c-461d-b31c-9341ab1eda47-catalog-content\") pod \"redhat-operators-5ghq9\" (UID: \"fd6f794f-d34c-461d-b31c-9341ab1eda47\") " pod="openshift-marketplace/redhat-operators-5ghq9" Jan 05 15:12:09 crc kubenswrapper[4740]: I0105 15:12:09.678153 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6f794f-d34c-461d-b31c-9341ab1eda47-utilities\") pod \"redhat-operators-5ghq9\" (UID: \"fd6f794f-d34c-461d-b31c-9341ab1eda47\") " pod="openshift-marketplace/redhat-operators-5ghq9" Jan 05 15:12:09 crc kubenswrapper[4740]: I0105 15:12:09.678255 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs8sp\" (UniqueName: \"kubernetes.io/projected/fd6f794f-d34c-461d-b31c-9341ab1eda47-kube-api-access-hs8sp\") pod \"redhat-operators-5ghq9\" (UID: \"fd6f794f-d34c-461d-b31c-9341ab1eda47\") " pod="openshift-marketplace/redhat-operators-5ghq9" Jan 05 15:12:09 crc kubenswrapper[4740]: I0105 15:12:09.780304 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6f794f-d34c-461d-b31c-9341ab1eda47-catalog-content\") pod \"redhat-operators-5ghq9\" (UID: \"fd6f794f-d34c-461d-b31c-9341ab1eda47\") " pod="openshift-marketplace/redhat-operators-5ghq9" Jan 05 15:12:09 crc kubenswrapper[4740]: I0105 15:12:09.780355 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6f794f-d34c-461d-b31c-9341ab1eda47-utilities\") pod \"redhat-operators-5ghq9\" (UID: \"fd6f794f-d34c-461d-b31c-9341ab1eda47\") " pod="openshift-marketplace/redhat-operators-5ghq9" Jan 05 15:12:09 crc kubenswrapper[4740]: I0105 15:12:09.780582 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs8sp\" (UniqueName: \"kubernetes.io/projected/fd6f794f-d34c-461d-b31c-9341ab1eda47-kube-api-access-hs8sp\") pod \"redhat-operators-5ghq9\" (UID: \"fd6f794f-d34c-461d-b31c-9341ab1eda47\") " pod="openshift-marketplace/redhat-operators-5ghq9" Jan 05 15:12:09 crc kubenswrapper[4740]: I0105 15:12:09.782202 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6f794f-d34c-461d-b31c-9341ab1eda47-utilities\") pod \"redhat-operators-5ghq9\" (UID: \"fd6f794f-d34c-461d-b31c-9341ab1eda47\") " pod="openshift-marketplace/redhat-operators-5ghq9" Jan 05 15:12:09 crc kubenswrapper[4740]: 
I0105 15:12:09.782755 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6f794f-d34c-461d-b31c-9341ab1eda47-catalog-content\") pod \"redhat-operators-5ghq9\" (UID: \"fd6f794f-d34c-461d-b31c-9341ab1eda47\") " pod="openshift-marketplace/redhat-operators-5ghq9" Jan 05 15:12:09 crc kubenswrapper[4740]: I0105 15:12:09.804043 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs8sp\" (UniqueName: \"kubernetes.io/projected/fd6f794f-d34c-461d-b31c-9341ab1eda47-kube-api-access-hs8sp\") pod \"redhat-operators-5ghq9\" (UID: \"fd6f794f-d34c-461d-b31c-9341ab1eda47\") " pod="openshift-marketplace/redhat-operators-5ghq9" Jan 05 15:12:09 crc kubenswrapper[4740]: I0105 15:12:09.900858 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ghq9" Jan 05 15:12:10 crc kubenswrapper[4740]: I0105 15:12:10.665941 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5ghq9"] Jan 05 15:12:11 crc kubenswrapper[4740]: I0105 15:12:11.675924 4740 generic.go:334] "Generic (PLEG): container finished" podID="fd6f794f-d34c-461d-b31c-9341ab1eda47" containerID="87b425dbd093c87af7f3b9034bbe0033e831c83548738f7116346ce2d12a6347" exitCode=0 Jan 05 15:12:11 crc kubenswrapper[4740]: I0105 15:12:11.676004 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ghq9" event={"ID":"fd6f794f-d34c-461d-b31c-9341ab1eda47","Type":"ContainerDied","Data":"87b425dbd093c87af7f3b9034bbe0033e831c83548738f7116346ce2d12a6347"} Jan 05 15:12:11 crc kubenswrapper[4740]: I0105 15:12:11.676434 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ghq9" event={"ID":"fd6f794f-d34c-461d-b31c-9341ab1eda47","Type":"ContainerStarted","Data":"78a5cd16ee65fb5084053b36c6f501ae4a5d2ba83afc7c7b2e367e42e64eec71"} Jan 05 15:12:14 crc kubenswrapper[4740]: I0105 15:12:14.710877 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ghq9" event={"ID":"fd6f794f-d34c-461d-b31c-9341ab1eda47","Type":"ContainerStarted","Data":"e2339f429e41dc70aca6cdf1777e4d35612d1faa75fb8acea5782a5d2774d0d3"} Jan 05 15:12:17 crc kubenswrapper[4740]: I0105 15:12:17.754837 4740 generic.go:334] "Generic (PLEG): container finished" podID="fd6f794f-d34c-461d-b31c-9341ab1eda47" containerID="e2339f429e41dc70aca6cdf1777e4d35612d1faa75fb8acea5782a5d2774d0d3" exitCode=0 Jan 05 15:12:17 crc kubenswrapper[4740]: I0105 15:12:17.754942 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ghq9" event={"ID":"fd6f794f-d34c-461d-b31c-9341ab1eda47","Type":"ContainerDied","Data":"e2339f429e41dc70aca6cdf1777e4d35612d1faa75fb8acea5782a5d2774d0d3"} Jan 05 15:12:19 crc kubenswrapper[4740]: I0105 15:12:19.781918 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ghq9" event={"ID":"fd6f794f-d34c-461d-b31c-9341ab1eda47","Type":"ContainerStarted","Data":"56a32610728ac1befd5e667afe8b271af91ad51d38edca531edeab33a44257ff"} Jan 05 15:12:19 crc kubenswrapper[4740]: I0105 15:12:19.813473 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5ghq9" podStartSLOduration=3.693346178 podStartE2EDuration="10.813177993s" podCreationTimestamp="2026-01-05 15:12:09 +0000 UTC" firstStartedPulling="2026-01-05 
15:12:11.677742634 +0000 UTC m=+4980.984651213" lastFinishedPulling="2026-01-05 15:12:18.797574399 +0000 UTC m=+4988.104483028" observedRunningTime="2026-01-05 15:12:19.800861144 +0000 UTC m=+4989.107769743" watchObservedRunningTime="2026-01-05 15:12:19.813177993 +0000 UTC m=+4989.120086572" Jan 05 15:12:19 crc kubenswrapper[4740]: I0105 15:12:19.901717 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5ghq9" Jan 05 15:12:19 crc kubenswrapper[4740]: I0105 15:12:19.901777 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5ghq9" Jan 05 15:12:20 crc kubenswrapper[4740]: I0105 15:12:20.963937 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5ghq9" podUID="fd6f794f-d34c-461d-b31c-9341ab1eda47" containerName="registry-server" probeResult="failure" output=< Jan 05 15:12:20 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:12:20 crc kubenswrapper[4740]: > Jan 05 15:12:30 crc kubenswrapper[4740]: I0105 15:12:30.964336 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5ghq9" podUID="fd6f794f-d34c-461d-b31c-9341ab1eda47" containerName="registry-server" probeResult="failure" output=< Jan 05 15:12:30 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:12:30 crc kubenswrapper[4740]: > Jan 05 15:12:39 crc kubenswrapper[4740]: I0105 15:12:39.974613 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5ghq9" Jan 05 15:12:40 crc kubenswrapper[4740]: I0105 15:12:40.047014 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5ghq9" Jan 05 15:12:40 crc kubenswrapper[4740]: I0105 15:12:40.738393 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5ghq9"] Jan 05 15:12:41 crc kubenswrapper[4740]: I0105 15:12:41.034734 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5ghq9" podUID="fd6f794f-d34c-461d-b31c-9341ab1eda47" containerName="registry-server" containerID="cri-o://56a32610728ac1befd5e667afe8b271af91ad51d38edca531edeab33a44257ff" gracePeriod=2 Jan 05 15:12:41 crc kubenswrapper[4740]: I0105 15:12:41.649131 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5ghq9" Jan 05 15:12:41 crc kubenswrapper[4740]: I0105 15:12:41.757232 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6f794f-d34c-461d-b31c-9341ab1eda47-catalog-content\") pod \"fd6f794f-d34c-461d-b31c-9341ab1eda47\" (UID: \"fd6f794f-d34c-461d-b31c-9341ab1eda47\") " Jan 05 15:12:41 crc kubenswrapper[4740]: I0105 15:12:41.757385 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6f794f-d34c-461d-b31c-9341ab1eda47-utilities\") pod \"fd6f794f-d34c-461d-b31c-9341ab1eda47\" (UID: \"fd6f794f-d34c-461d-b31c-9341ab1eda47\") " Jan 05 15:12:41 crc kubenswrapper[4740]: I0105 15:12:41.757444 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs8sp\" (UniqueName: \"kubernetes.io/projected/fd6f794f-d34c-461d-b31c-9341ab1eda47-kube-api-access-hs8sp\") pod \"fd6f794f-d34c-461d-b31c-9341ab1eda47\" (UID: \"fd6f794f-d34c-461d-b31c-9341ab1eda47\") " Jan 05 15:12:41 crc kubenswrapper[4740]: I0105 15:12:41.759442 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd6f794f-d34c-461d-b31c-9341ab1eda47-utilities" (OuterVolumeSpecName: "utilities") pod "fd6f794f-d34c-461d-b31c-9341ab1eda47" (UID: "fd6f794f-d34c-461d-b31c-9341ab1eda47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:12:41 crc kubenswrapper[4740]: I0105 15:12:41.768424 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd6f794f-d34c-461d-b31c-9341ab1eda47-kube-api-access-hs8sp" (OuterVolumeSpecName: "kube-api-access-hs8sp") pod "fd6f794f-d34c-461d-b31c-9341ab1eda47" (UID: "fd6f794f-d34c-461d-b31c-9341ab1eda47"). InnerVolumeSpecName "kube-api-access-hs8sp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 15:12:41 crc kubenswrapper[4740]: I0105 15:12:41.860109 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6f794f-d34c-461d-b31c-9341ab1eda47-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 15:12:41 crc kubenswrapper[4740]: I0105 15:12:41.860391 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs8sp\" (UniqueName: \"kubernetes.io/projected/fd6f794f-d34c-461d-b31c-9341ab1eda47-kube-api-access-hs8sp\") on node \"crc\" DevicePath \"\"" Jan 05 15:12:41 crc kubenswrapper[4740]: I0105 15:12:41.906675 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd6f794f-d34c-461d-b31c-9341ab1eda47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd6f794f-d34c-461d-b31c-9341ab1eda47" (UID: "fd6f794f-d34c-461d-b31c-9341ab1eda47"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:12:41 crc kubenswrapper[4740]: I0105 15:12:41.962291 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6f794f-d34c-461d-b31c-9341ab1eda47-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 15:12:42 crc kubenswrapper[4740]: I0105 15:12:42.071383 4740 generic.go:334] "Generic (PLEG): container finished" podID="fd6f794f-d34c-461d-b31c-9341ab1eda47" containerID="56a32610728ac1befd5e667afe8b271af91ad51d38edca531edeab33a44257ff" exitCode=0 Jan 05 15:12:42 crc kubenswrapper[4740]: I0105 15:12:42.071440 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ghq9" event={"ID":"fd6f794f-d34c-461d-b31c-9341ab1eda47","Type":"ContainerDied","Data":"56a32610728ac1befd5e667afe8b271af91ad51d38edca531edeab33a44257ff"} Jan 05 15:12:42 crc kubenswrapper[4740]: I0105 15:12:42.071477 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ghq9" event={"ID":"fd6f794f-d34c-461d-b31c-9341ab1eda47","Type":"ContainerDied","Data":"78a5cd16ee65fb5084053b36c6f501ae4a5d2ba83afc7c7b2e367e42e64eec71"} Jan 05 15:12:42 crc kubenswrapper[4740]: I0105 15:12:42.071505 4740 scope.go:117] "RemoveContainer" containerID="56a32610728ac1befd5e667afe8b271af91ad51d38edca531edeab33a44257ff" Jan 05 15:12:42 crc kubenswrapper[4740]: I0105 15:12:42.071767 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ghq9" Jan 05 15:12:42 crc kubenswrapper[4740]: I0105 15:12:42.123272 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5ghq9"] Jan 05 15:12:42 crc kubenswrapper[4740]: I0105 15:12:42.127820 4740 scope.go:117] "RemoveContainer" containerID="e2339f429e41dc70aca6cdf1777e4d35612d1faa75fb8acea5782a5d2774d0d3" Jan 05 15:12:42 crc kubenswrapper[4740]: I0105 15:12:42.137215 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5ghq9"] Jan 05 15:12:42 crc kubenswrapper[4740]: I0105 15:12:42.163820 4740 scope.go:117] "RemoveContainer" containerID="87b425dbd093c87af7f3b9034bbe0033e831c83548738f7116346ce2d12a6347" Jan 05 15:12:42 crc kubenswrapper[4740]: I0105 15:12:42.223608 4740 scope.go:117] "RemoveContainer" containerID="56a32610728ac1befd5e667afe8b271af91ad51d38edca531edeab33a44257ff" Jan 05 15:12:42 crc kubenswrapper[4740]: E0105 15:12:42.225045 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56a32610728ac1befd5e667afe8b271af91ad51d38edca531edeab33a44257ff\": container with ID starting with 56a32610728ac1befd5e667afe8b271af91ad51d38edca531edeab33a44257ff not found: ID does not exist" containerID="56a32610728ac1befd5e667afe8b271af91ad51d38edca531edeab33a44257ff" Jan 05 15:12:42 crc kubenswrapper[4740]: I0105 15:12:42.225360 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a32610728ac1befd5e667afe8b271af91ad51d38edca531edeab33a44257ff"} err="failed to get container status \"56a32610728ac1befd5e667afe8b271af91ad51d38edca531edeab33a44257ff\": rpc error: code = NotFound desc = could not find container \"56a32610728ac1befd5e667afe8b271af91ad51d38edca531edeab33a44257ff\": container with ID starting with 56a32610728ac1befd5e667afe8b271af91ad51d38edca531edeab33a44257ff not found: ID does not exist" Jan 05 15:12:42 crc 
kubenswrapper[4740]: I0105 15:12:42.225405 4740 scope.go:117] "RemoveContainer" containerID="e2339f429e41dc70aca6cdf1777e4d35612d1faa75fb8acea5782a5d2774d0d3" Jan 05 15:12:42 crc kubenswrapper[4740]: E0105 15:12:42.225981 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2339f429e41dc70aca6cdf1777e4d35612d1faa75fb8acea5782a5d2774d0d3\": container with ID starting with e2339f429e41dc70aca6cdf1777e4d35612d1faa75fb8acea5782a5d2774d0d3 not found: ID does not exist" containerID="e2339f429e41dc70aca6cdf1777e4d35612d1faa75fb8acea5782a5d2774d0d3" Jan 05 15:12:42 crc kubenswrapper[4740]: I0105 15:12:42.226033 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2339f429e41dc70aca6cdf1777e4d35612d1faa75fb8acea5782a5d2774d0d3"} err="failed to get container status \"e2339f429e41dc70aca6cdf1777e4d35612d1faa75fb8acea5782a5d2774d0d3\": rpc error: code = NotFound desc = could not find container \"e2339f429e41dc70aca6cdf1777e4d35612d1faa75fb8acea5782a5d2774d0d3\": container with ID starting with e2339f429e41dc70aca6cdf1777e4d35612d1faa75fb8acea5782a5d2774d0d3 not found: ID does not exist" Jan 05 15:12:42 crc kubenswrapper[4740]: I0105 15:12:42.226085 4740 scope.go:117] "RemoveContainer" containerID="87b425dbd093c87af7f3b9034bbe0033e831c83548738f7116346ce2d12a6347" Jan 05 15:12:42 crc kubenswrapper[4740]: E0105 15:12:42.226404 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87b425dbd093c87af7f3b9034bbe0033e831c83548738f7116346ce2d12a6347\": container with ID starting with 87b425dbd093c87af7f3b9034bbe0033e831c83548738f7116346ce2d12a6347 not found: ID does not exist" containerID="87b425dbd093c87af7f3b9034bbe0033e831c83548738f7116346ce2d12a6347" Jan 05 15:12:42 crc kubenswrapper[4740]: I0105 15:12:42.226451 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87b425dbd093c87af7f3b9034bbe0033e831c83548738f7116346ce2d12a6347"} err="failed to get container status \"87b425dbd093c87af7f3b9034bbe0033e831c83548738f7116346ce2d12a6347\": rpc error: code = NotFound desc = could not find container \"87b425dbd093c87af7f3b9034bbe0033e831c83548738f7116346ce2d12a6347\": container with ID starting with 87b425dbd093c87af7f3b9034bbe0033e831c83548738f7116346ce2d12a6347 not found: ID does not exist" Jan 05 15:12:42 crc kubenswrapper[4740]: I0105 15:12:42.986615 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd6f794f-d34c-461d-b31c-9341ab1eda47" path="/var/lib/kubelet/pods/fd6f794f-d34c-461d-b31c-9341ab1eda47/volumes" Jan 05 15:13:01 crc kubenswrapper[4740]: I0105 15:13:01.915547 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 15:13:01 crc kubenswrapper[4740]: I0105 15:13:01.916339 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 15:13:18 crc kubenswrapper[4740]: I0105 15:13:18.532616 4740 generic.go:334] "Generic (PLEG): 
container finished" podID="efc1648a-7270-45c6-af93-bd4b641931d2" containerID="43d04121a6ef67b06a60a9130b318784924d123d066d8260a4de16e86f97afea" exitCode=1 Jan 05 15:13:18 crc kubenswrapper[4740]: I0105 15:13:18.532788 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"efc1648a-7270-45c6-af93-bd4b641931d2","Type":"ContainerDied","Data":"43d04121a6ef67b06a60a9130b318784924d123d066d8260a4de16e86f97afea"} Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.023115 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.123766 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-openstack-config-secret\") pod \"efc1648a-7270-45c6-af93-bd4b641931d2\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.124656 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-ssh-key\") pod \"efc1648a-7270-45c6-af93-bd4b641931d2\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.124712 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt95g\" (UniqueName: \"kubernetes.io/projected/efc1648a-7270-45c6-af93-bd4b641931d2-kube-api-access-kt95g\") pod \"efc1648a-7270-45c6-af93-bd4b641931d2\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.125688 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-ca-certs\") pod \"efc1648a-7270-45c6-af93-bd4b641931d2\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.125743 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/efc1648a-7270-45c6-af93-bd4b641931d2-test-operator-ephemeral-workdir\") pod \"efc1648a-7270-45c6-af93-bd4b641931d2\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.125773 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/efc1648a-7270-45c6-af93-bd4b641931d2-test-operator-ephemeral-temporary\") pod \"efc1648a-7270-45c6-af93-bd4b641931d2\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.125862 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/efc1648a-7270-45c6-af93-bd4b641931d2-openstack-config\") pod \"efc1648a-7270-45c6-af93-bd4b641931d2\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.125891 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"efc1648a-7270-45c6-af93-bd4b641931d2\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " Jan 05 15:13:20 crc 
kubenswrapper[4740]: I0105 15:13:20.126011 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/efc1648a-7270-45c6-af93-bd4b641931d2-config-data\") pod \"efc1648a-7270-45c6-af93-bd4b641931d2\" (UID: \"efc1648a-7270-45c6-af93-bd4b641931d2\") " Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.127268 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efc1648a-7270-45c6-af93-bd4b641931d2-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "efc1648a-7270-45c6-af93-bd4b641931d2" (UID: "efc1648a-7270-45c6-af93-bd4b641931d2"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.129353 4740 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/efc1648a-7270-45c6-af93-bd4b641931d2-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.130510 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc1648a-7270-45c6-af93-bd4b641931d2-config-data" (OuterVolumeSpecName: "config-data") pod "efc1648a-7270-45c6-af93-bd4b641931d2" (UID: "efc1648a-7270-45c6-af93-bd4b641931d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.134428 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "efc1648a-7270-45c6-af93-bd4b641931d2" (UID: "efc1648a-7270-45c6-af93-bd4b641931d2"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.138796 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc1648a-7270-45c6-af93-bd4b641931d2-kube-api-access-kt95g" (OuterVolumeSpecName: "kube-api-access-kt95g") pod "efc1648a-7270-45c6-af93-bd4b641931d2" (UID: "efc1648a-7270-45c6-af93-bd4b641931d2"). InnerVolumeSpecName "kube-api-access-kt95g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.139548 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efc1648a-7270-45c6-af93-bd4b641931d2-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "efc1648a-7270-45c6-af93-bd4b641931d2" (UID: "efc1648a-7270-45c6-af93-bd4b641931d2"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.162429 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "efc1648a-7270-45c6-af93-bd4b641931d2" (UID: "efc1648a-7270-45c6-af93-bd4b641931d2"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.163416 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "efc1648a-7270-45c6-af93-bd4b641931d2" (UID: "efc1648a-7270-45c6-af93-bd4b641931d2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.174016 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "efc1648a-7270-45c6-af93-bd4b641931d2" (UID: "efc1648a-7270-45c6-af93-bd4b641931d2"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.194750 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc1648a-7270-45c6-af93-bd4b641931d2-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "efc1648a-7270-45c6-af93-bd4b641931d2" (UID: "efc1648a-7270-45c6-af93-bd4b641931d2"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.231522 4740 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.231559 4740 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.231569 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt95g\" (UniqueName: \"kubernetes.io/projected/efc1648a-7270-45c6-af93-bd4b641931d2-kube-api-access-kt95g\") on node \"crc\" DevicePath \"\"" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.231577 4740 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/efc1648a-7270-45c6-af93-bd4b641931d2-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.231587 4740 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/efc1648a-7270-45c6-af93-bd4b641931d2-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.231598 4740 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/efc1648a-7270-45c6-af93-bd4b641931d2-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.231631 4740 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.231644 4740 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/efc1648a-7270-45c6-af93-bd4b641931d2-config-data\") on node \"crc\" DevicePath \"\"" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.266950 
4740 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.333719 4740 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.557180 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"efc1648a-7270-45c6-af93-bd4b641931d2","Type":"ContainerDied","Data":"1844a5a25e10f8ff3d5935c66e67e7944b8553c3067d19869395ab97b855e364"} Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.557552 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1844a5a25e10f8ff3d5935c66e67e7944b8553c3067d19869395ab97b855e364" Jan 05 15:13:20 crc kubenswrapper[4740]: I0105 15:13:20.557391 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.484398 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 05 15:13:31 crc kubenswrapper[4740]: E0105 15:13:31.486274 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6f794f-d34c-461d-b31c-9341ab1eda47" containerName="registry-server" Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.486313 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6f794f-d34c-461d-b31c-9341ab1eda47" containerName="registry-server" Jan 05 15:13:31 crc kubenswrapper[4740]: E0105 15:13:31.486388 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6f794f-d34c-461d-b31c-9341ab1eda47" containerName="extract-content" Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.486406 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6f794f-d34c-461d-b31c-9341ab1eda47" containerName="extract-content" Jan 05 15:13:31 crc kubenswrapper[4740]: E0105 15:13:31.486452 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6f794f-d34c-461d-b31c-9341ab1eda47" containerName="extract-utilities" Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.486469 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6f794f-d34c-461d-b31c-9341ab1eda47" containerName="extract-utilities" Jan 05 15:13:31 crc kubenswrapper[4740]: E0105 15:13:31.486504 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc1648a-7270-45c6-af93-bd4b641931d2" containerName="tempest-tests-tempest-tests-runner" Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.486522 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc1648a-7270-45c6-af93-bd4b641931d2" containerName="tempest-tests-tempest-tests-runner" Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.487172 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc1648a-7270-45c6-af93-bd4b641931d2" containerName="tempest-tests-tempest-tests-runner" Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.487275 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd6f794f-d34c-461d-b31c-9341ab1eda47" containerName="registry-server" Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.489312 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.499624 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jv2n5" Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.499916 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.543044 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"990a893a-383c-432f-822e-a3da59d882c1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.543266 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q52cp\" (UniqueName: \"kubernetes.io/projected/990a893a-383c-432f-822e-a3da59d882c1-kube-api-access-q52cp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"990a893a-383c-432f-822e-a3da59d882c1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.645455 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"990a893a-383c-432f-822e-a3da59d882c1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.645817 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q52cp\" (UniqueName: \"kubernetes.io/projected/990a893a-383c-432f-822e-a3da59d882c1-kube-api-access-q52cp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"990a893a-383c-432f-822e-a3da59d882c1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.646435 4740 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"990a893a-383c-432f-822e-a3da59d882c1\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.669845 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q52cp\" (UniqueName: \"kubernetes.io/projected/990a893a-383c-432f-822e-a3da59d882c1-kube-api-access-q52cp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"990a893a-383c-432f-822e-a3da59d882c1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.686305 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"990a893a-383c-432f-822e-a3da59d882c1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 05 15:13:31 crc 
kubenswrapper[4740]: I0105 15:13:31.812111 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.916805 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 15:13:31 crc kubenswrapper[4740]: I0105 15:13:31.917210 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 15:13:32 crc kubenswrapper[4740]: I0105 15:13:32.292993 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 05 15:13:32 crc kubenswrapper[4740]: I0105 15:13:32.702676 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"990a893a-383c-432f-822e-a3da59d882c1","Type":"ContainerStarted","Data":"acfc27c6c7b3dcdadc930ff9c60975ccb18e354e2023807b0d360dab6788eec9"} Jan 05 15:13:35 crc kubenswrapper[4740]: I0105 15:13:35.756834 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"990a893a-383c-432f-822e-a3da59d882c1","Type":"ContainerStarted","Data":"ebdd335d11a3f56417adfae05c4e99b4b44423b7d0e0b7fbb27e70974e638b88"} Jan 05 15:13:35 crc kubenswrapper[4740]: I0105 15:13:35.782839 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=3.542635647 podStartE2EDuration="4.782802391s" podCreationTimestamp="2026-01-05 15:13:31 +0000 UTC" firstStartedPulling="2026-01-05 15:13:32.296733201 +0000 UTC m=+5061.603641780" lastFinishedPulling="2026-01-05 15:13:33.536899935 +0000 UTC m=+5062.843808524" observedRunningTime="2026-01-05 15:13:35.774460028 +0000 UTC m=+5065.081368647" watchObservedRunningTime="2026-01-05 15:13:35.782802391 +0000 UTC m=+5065.089711010" Jan 05 15:14:01 crc kubenswrapper[4740]: I0105 15:14:01.915909 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 15:14:01 crc kubenswrapper[4740]: I0105 15:14:01.916462 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 15:14:01 crc kubenswrapper[4740]: I0105 15:14:01.916506 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 15:14:01 crc kubenswrapper[4740]: I0105 15:14:01.917509 4740 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5b8a6b87265d54ac4ffe376ba3cafb17fa6677039615a1c5d95bd77a60f25bca"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 15:14:01 crc kubenswrapper[4740]: I0105 15:14:01.917576 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://5b8a6b87265d54ac4ffe376ba3cafb17fa6677039615a1c5d95bd77a60f25bca" gracePeriod=600 Jan 05 15:14:03 crc kubenswrapper[4740]: I0105 15:14:03.118495 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="5b8a6b87265d54ac4ffe376ba3cafb17fa6677039615a1c5d95bd77a60f25bca" exitCode=0 Jan 05 15:14:03 crc kubenswrapper[4740]: I0105 15:14:03.118563 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"5b8a6b87265d54ac4ffe376ba3cafb17fa6677039615a1c5d95bd77a60f25bca"} Jan 05 15:14:03 crc kubenswrapper[4740]: I0105 15:14:03.118896 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b"} Jan 05 15:14:03 crc kubenswrapper[4740]: I0105 15:14:03.118914 4740 scope.go:117] "RemoveContainer" containerID="18626377d5da1f71276cc44103997640c4a829d9add5a849395c134ee6ad45be" Jan 05 15:14:13 crc kubenswrapper[4740]: I0105 15:14:13.700771 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fpnwq/must-gather-9qwfr"] Jan 05 15:14:13 crc kubenswrapper[4740]: I0105 15:14:13.706138 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fpnwq/must-gather-9qwfr" Jan 05 15:14:13 crc kubenswrapper[4740]: I0105 15:14:13.709897 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fpnwq"/"openshift-service-ca.crt" Jan 05 15:14:13 crc kubenswrapper[4740]: I0105 15:14:13.709979 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fpnwq"/"kube-root-ca.crt" Jan 05 15:14:13 crc kubenswrapper[4740]: I0105 15:14:13.709906 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-fpnwq"/"default-dockercfg-f54dq" Jan 05 15:14:13 crc kubenswrapper[4740]: I0105 15:14:13.717112 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fpnwq/must-gather-9qwfr"] Jan 05 15:14:13 crc kubenswrapper[4740]: I0105 15:14:13.806469 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e85987fd-4f1a-4e66-bfab-53cd215c5a50-must-gather-output\") pod \"must-gather-9qwfr\" (UID: \"e85987fd-4f1a-4e66-bfab-53cd215c5a50\") " pod="openshift-must-gather-fpnwq/must-gather-9qwfr" Jan 05 15:14:13 crc kubenswrapper[4740]: I0105 15:14:13.806907 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx997\" (UniqueName: \"kubernetes.io/projected/e85987fd-4f1a-4e66-bfab-53cd215c5a50-kube-api-access-kx997\") pod \"must-gather-9qwfr\" (UID: \"e85987fd-4f1a-4e66-bfab-53cd215c5a50\") " pod="openshift-must-gather-fpnwq/must-gather-9qwfr" Jan 05 15:14:13 crc kubenswrapper[4740]: I0105 15:14:13.909097 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e85987fd-4f1a-4e66-bfab-53cd215c5a50-must-gather-output\") pod \"must-gather-9qwfr\" (UID: \"e85987fd-4f1a-4e66-bfab-53cd215c5a50\") " pod="openshift-must-gather-fpnwq/must-gather-9qwfr" Jan 05 15:14:13 crc kubenswrapper[4740]: I0105 15:14:13.909241 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx997\" (UniqueName: \"kubernetes.io/projected/e85987fd-4f1a-4e66-bfab-53cd215c5a50-kube-api-access-kx997\") pod \"must-gather-9qwfr\" (UID: \"e85987fd-4f1a-4e66-bfab-53cd215c5a50\") " pod="openshift-must-gather-fpnwq/must-gather-9qwfr" Jan 05 15:14:13 crc kubenswrapper[4740]: I0105 15:14:13.909584 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e85987fd-4f1a-4e66-bfab-53cd215c5a50-must-gather-output\") pod \"must-gather-9qwfr\" (UID: \"e85987fd-4f1a-4e66-bfab-53cd215c5a50\") " pod="openshift-must-gather-fpnwq/must-gather-9qwfr" Jan 05 15:14:13 crc kubenswrapper[4740]: I0105 15:14:13.927500 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx997\" (UniqueName: \"kubernetes.io/projected/e85987fd-4f1a-4e66-bfab-53cd215c5a50-kube-api-access-kx997\") pod \"must-gather-9qwfr\" (UID: \"e85987fd-4f1a-4e66-bfab-53cd215c5a50\") " pod="openshift-must-gather-fpnwq/must-gather-9qwfr" Jan 05 15:14:14 crc kubenswrapper[4740]: I0105 15:14:14.031934 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fpnwq/must-gather-9qwfr" Jan 05 15:14:14 crc kubenswrapper[4740]: I0105 15:14:14.564400 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fpnwq/must-gather-9qwfr"] Jan 05 15:14:15 crc kubenswrapper[4740]: I0105 15:14:15.300362 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fpnwq/must-gather-9qwfr" event={"ID":"e85987fd-4f1a-4e66-bfab-53cd215c5a50","Type":"ContainerStarted","Data":"8c8b350f064c283ec07efba31cf935e8316966d2adda1e4f97290543dea569f3"} Jan 05 15:14:23 crc kubenswrapper[4740]: I0105 15:14:23.395701 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fpnwq/must-gather-9qwfr" event={"ID":"e85987fd-4f1a-4e66-bfab-53cd215c5a50","Type":"ContainerStarted","Data":"6231234ec5e0e35f6f6b5890059692067c9c8bb6eebd9620815a08e62c89e841"} Jan 05 15:14:23 crc kubenswrapper[4740]: I0105 15:14:23.396378 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fpnwq/must-gather-9qwfr" event={"ID":"e85987fd-4f1a-4e66-bfab-53cd215c5a50","Type":"ContainerStarted","Data":"6d08baa59f9c5a391990f2a6051b52acc30f14308f2312591214f093296f3329"} Jan 05 15:14:23 crc kubenswrapper[4740]: I0105 15:14:23.438382 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fpnwq/must-gather-9qwfr" podStartSLOduration=2.965170672 podStartE2EDuration="10.438360067s" podCreationTimestamp="2026-01-05 15:14:13 +0000 UTC" firstStartedPulling="2026-01-05 15:14:15.15397404 +0000 UTC m=+5104.460882619" lastFinishedPulling="2026-01-05 15:14:22.627163435 +0000 UTC m=+5111.934072014" observedRunningTime="2026-01-05 15:14:23.41936478 +0000 UTC m=+5112.726273379" watchObservedRunningTime="2026-01-05 15:14:23.438360067 +0000 UTC m=+5112.745268646" Jan 05 15:14:27 crc kubenswrapper[4740]: I0105 15:14:27.608203 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fpnwq/crc-debug-xrqrf"] Jan 05 15:14:27 crc kubenswrapper[4740]: I0105 15:14:27.611302 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fpnwq/crc-debug-xrqrf" Jan 05 15:14:27 crc kubenswrapper[4740]: I0105 15:14:27.666917 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22b2e838-f61a-45b7-b582-de4427c875a0-host\") pod \"crc-debug-xrqrf\" (UID: \"22b2e838-f61a-45b7-b582-de4427c875a0\") " pod="openshift-must-gather-fpnwq/crc-debug-xrqrf" Jan 05 15:14:27 crc kubenswrapper[4740]: I0105 15:14:27.667123 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djqhh\" (UniqueName: \"kubernetes.io/projected/22b2e838-f61a-45b7-b582-de4427c875a0-kube-api-access-djqhh\") pod \"crc-debug-xrqrf\" (UID: \"22b2e838-f61a-45b7-b582-de4427c875a0\") " pod="openshift-must-gather-fpnwq/crc-debug-xrqrf" Jan 05 15:14:27 crc kubenswrapper[4740]: I0105 15:14:27.769131 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22b2e838-f61a-45b7-b582-de4427c875a0-host\") pod \"crc-debug-xrqrf\" (UID: \"22b2e838-f61a-45b7-b582-de4427c875a0\") " pod="openshift-must-gather-fpnwq/crc-debug-xrqrf" Jan 05 15:14:27 crc kubenswrapper[4740]: I0105 15:14:27.769353 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djqhh\" (UniqueName: \"kubernetes.io/projected/22b2e838-f61a-45b7-b582-de4427c875a0-kube-api-access-djqhh\") pod \"crc-debug-xrqrf\" (UID: \"22b2e838-f61a-45b7-b582-de4427c875a0\") " pod="openshift-must-gather-fpnwq/crc-debug-xrqrf" Jan 05 15:14:27 crc kubenswrapper[4740]: I0105 15:14:27.769612 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22b2e838-f61a-45b7-b582-de4427c875a0-host\") pod \"crc-debug-xrqrf\" (UID: \"22b2e838-f61a-45b7-b582-de4427c875a0\") " pod="openshift-must-gather-fpnwq/crc-debug-xrqrf" Jan 05 15:14:27 crc kubenswrapper[4740]: I0105 15:14:27.789329 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djqhh\" (UniqueName: \"kubernetes.io/projected/22b2e838-f61a-45b7-b582-de4427c875a0-kube-api-access-djqhh\") pod \"crc-debug-xrqrf\" (UID: \"22b2e838-f61a-45b7-b582-de4427c875a0\") " pod="openshift-must-gather-fpnwq/crc-debug-xrqrf" Jan 05 15:14:27 crc kubenswrapper[4740]: I0105 15:14:27.931903 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fpnwq/crc-debug-xrqrf" Jan 05 15:14:29 crc kubenswrapper[4740]: I0105 15:14:29.480634 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fpnwq/crc-debug-xrqrf" event={"ID":"22b2e838-f61a-45b7-b582-de4427c875a0","Type":"ContainerStarted","Data":"556919aa1b93ec5d00d9d172ba4b2e8f414825db18e8c45fe4489b55b2ea1bba"} Jan 05 15:14:40 crc kubenswrapper[4740]: I0105 15:14:40.602700 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fpnwq/crc-debug-xrqrf" event={"ID":"22b2e838-f61a-45b7-b582-de4427c875a0","Type":"ContainerStarted","Data":"6ec97acc85ca9bd08d151c784d2f13739c092ceb742b59a4e982aa62a2b7ffa6"} Jan 05 15:14:40 crc kubenswrapper[4740]: I0105 15:14:40.623872 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fpnwq/crc-debug-xrqrf" podStartSLOduration=2.923928885 podStartE2EDuration="13.623856711s" podCreationTimestamp="2026-01-05 15:14:27 +0000 UTC" firstStartedPulling="2026-01-05 15:14:28.553312426 +0000 UTC m=+5117.860221035" lastFinishedPulling="2026-01-05 15:14:39.253240282 +0000 UTC m=+5128.560148861" observedRunningTime="2026-01-05 15:14:40.619990037 +0000 UTC m=+5129.926898616" watchObservedRunningTime="2026-01-05 15:14:40.623856711 +0000 UTC m=+5129.930765290" Jan 05 15:15:00 crc kubenswrapper[4740]: I0105 15:15:00.168954 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862"] Jan 05 15:15:00 crc kubenswrapper[4740]: I0105 15:15:00.171481 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" Jan 05 15:15:00 crc kubenswrapper[4740]: I0105 15:15:00.173799 4740 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 05 15:15:00 crc kubenswrapper[4740]: I0105 15:15:00.173962 4740 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 05 15:15:00 crc kubenswrapper[4740]: I0105 15:15:00.271725 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862"] Jan 05 15:15:00 crc kubenswrapper[4740]: I0105 15:15:00.277514 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f00979a-073c-4681-baba-6c6eab07b907-config-volume\") pod \"collect-profiles-29460435-jt862\" (UID: \"9f00979a-073c-4681-baba-6c6eab07b907\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" Jan 05 15:15:00 crc kubenswrapper[4740]: I0105 15:15:00.277778 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwkvf\" (UniqueName: \"kubernetes.io/projected/9f00979a-073c-4681-baba-6c6eab07b907-kube-api-access-fwkvf\") pod \"collect-profiles-29460435-jt862\" (UID: \"9f00979a-073c-4681-baba-6c6eab07b907\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" Jan 05 15:15:00 crc kubenswrapper[4740]: I0105 15:15:00.277968 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f00979a-073c-4681-baba-6c6eab07b907-secret-volume\") pod \"collect-profiles-29460435-jt862\" 
(UID: \"9f00979a-073c-4681-baba-6c6eab07b907\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" Jan 05 15:15:00 crc kubenswrapper[4740]: I0105 15:15:00.380594 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f00979a-073c-4681-baba-6c6eab07b907-secret-volume\") pod \"collect-profiles-29460435-jt862\" (UID: \"9f00979a-073c-4681-baba-6c6eab07b907\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" Jan 05 15:15:00 crc kubenswrapper[4740]: I0105 15:15:00.381010 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f00979a-073c-4681-baba-6c6eab07b907-config-volume\") pod \"collect-profiles-29460435-jt862\" (UID: \"9f00979a-073c-4681-baba-6c6eab07b907\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" Jan 05 15:15:00 crc kubenswrapper[4740]: I0105 15:15:00.381342 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwkvf\" (UniqueName: \"kubernetes.io/projected/9f00979a-073c-4681-baba-6c6eab07b907-kube-api-access-fwkvf\") pod \"collect-profiles-29460435-jt862\" (UID: \"9f00979a-073c-4681-baba-6c6eab07b907\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" Jan 05 15:15:00 crc kubenswrapper[4740]: I0105 15:15:00.382226 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f00979a-073c-4681-baba-6c6eab07b907-config-volume\") pod \"collect-profiles-29460435-jt862\" (UID: \"9f00979a-073c-4681-baba-6c6eab07b907\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" Jan 05 15:15:00 crc kubenswrapper[4740]: I0105 15:15:00.734039 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwkvf\" (UniqueName: \"kubernetes.io/projected/9f00979a-073c-4681-baba-6c6eab07b907-kube-api-access-fwkvf\") pod \"collect-profiles-29460435-jt862\" (UID: \"9f00979a-073c-4681-baba-6c6eab07b907\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" Jan 05 15:15:00 crc kubenswrapper[4740]: I0105 15:15:00.734220 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f00979a-073c-4681-baba-6c6eab07b907-secret-volume\") pod \"collect-profiles-29460435-jt862\" (UID: \"9f00979a-073c-4681-baba-6c6eab07b907\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" Jan 05 15:15:00 crc kubenswrapper[4740]: I0105 15:15:00.795847 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" Jan 05 15:15:01 crc kubenswrapper[4740]: I0105 15:15:01.418841 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862"] Jan 05 15:15:01 crc kubenswrapper[4740]: I0105 15:15:01.852159 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" event={"ID":"9f00979a-073c-4681-baba-6c6eab07b907","Type":"ContainerStarted","Data":"c676f35fc03dd5749b27d6677fada591cd450f3814ef6fff44b382283243b895"} Jan 05 15:15:01 crc kubenswrapper[4740]: I0105 15:15:01.852686 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" event={"ID":"9f00979a-073c-4681-baba-6c6eab07b907","Type":"ContainerStarted","Data":"1a5741f1ff6cccebf7e4c7f00064eb43809d8a3fad14c8f48fce56403331c54f"} Jan 05 15:15:01 crc kubenswrapper[4740]: I0105 15:15:01.871862 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" podStartSLOduration=1.871840344 podStartE2EDuration="1.871840344s" podCreationTimestamp="2026-01-05 15:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-05 15:15:01.867660451 +0000 UTC m=+5151.174569030" watchObservedRunningTime="2026-01-05 15:15:01.871840344 +0000 UTC m=+5151.178748923" Jan 05 15:15:02 crc kubenswrapper[4740]: I0105 15:15:02.870235 4740 generic.go:334] "Generic (PLEG): container finished" podID="9f00979a-073c-4681-baba-6c6eab07b907" containerID="c676f35fc03dd5749b27d6677fada591cd450f3814ef6fff44b382283243b895" exitCode=0 Jan 05 15:15:02 crc kubenswrapper[4740]: I0105 15:15:02.870541 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" event={"ID":"9f00979a-073c-4681-baba-6c6eab07b907","Type":"ContainerDied","Data":"c676f35fc03dd5749b27d6677fada591cd450f3814ef6fff44b382283243b895"} Jan 05 15:15:04 crc kubenswrapper[4740]: I0105 15:15:04.482382 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" Jan 05 15:15:04 crc kubenswrapper[4740]: I0105 15:15:04.581392 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwkvf\" (UniqueName: \"kubernetes.io/projected/9f00979a-073c-4681-baba-6c6eab07b907-kube-api-access-fwkvf\") pod \"9f00979a-073c-4681-baba-6c6eab07b907\" (UID: \"9f00979a-073c-4681-baba-6c6eab07b907\") " Jan 05 15:15:04 crc kubenswrapper[4740]: I0105 15:15:04.582098 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f00979a-073c-4681-baba-6c6eab07b907-config-volume\") pod \"9f00979a-073c-4681-baba-6c6eab07b907\" (UID: \"9f00979a-073c-4681-baba-6c6eab07b907\") " Jan 05 15:15:04 crc kubenswrapper[4740]: I0105 15:15:04.582181 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f00979a-073c-4681-baba-6c6eab07b907-secret-volume\") pod \"9f00979a-073c-4681-baba-6c6eab07b907\" (UID: \"9f00979a-073c-4681-baba-6c6eab07b907\") " Jan 05 15:15:04 crc kubenswrapper[4740]: I0105 15:15:04.582939 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f00979a-073c-4681-baba-6c6eab07b907-config-volume" (OuterVolumeSpecName: "config-volume") pod "9f00979a-073c-4681-baba-6c6eab07b907" (UID: "9f00979a-073c-4681-baba-6c6eab07b907"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 05 15:15:04 crc kubenswrapper[4740]: I0105 15:15:04.583456 4740 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f00979a-073c-4681-baba-6c6eab07b907-config-volume\") on node \"crc\" DevicePath \"\"" Jan 05 15:15:04 crc kubenswrapper[4740]: I0105 15:15:04.588229 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f00979a-073c-4681-baba-6c6eab07b907-kube-api-access-fwkvf" (OuterVolumeSpecName: "kube-api-access-fwkvf") pod "9f00979a-073c-4681-baba-6c6eab07b907" (UID: "9f00979a-073c-4681-baba-6c6eab07b907"). InnerVolumeSpecName "kube-api-access-fwkvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 15:15:04 crc kubenswrapper[4740]: I0105 15:15:04.596593 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f00979a-073c-4681-baba-6c6eab07b907-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9f00979a-073c-4681-baba-6c6eab07b907" (UID: "9f00979a-073c-4681-baba-6c6eab07b907"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 05 15:15:04 crc kubenswrapper[4740]: I0105 15:15:04.685854 4740 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f00979a-073c-4681-baba-6c6eab07b907-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 05 15:15:04 crc kubenswrapper[4740]: I0105 15:15:04.685892 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwkvf\" (UniqueName: \"kubernetes.io/projected/9f00979a-073c-4681-baba-6c6eab07b907-kube-api-access-fwkvf\") on node \"crc\" DevicePath \"\"" Jan 05 15:15:04 crc kubenswrapper[4740]: I0105 15:15:04.921482 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" event={"ID":"9f00979a-073c-4681-baba-6c6eab07b907","Type":"ContainerDied","Data":"1a5741f1ff6cccebf7e4c7f00064eb43809d8a3fad14c8f48fce56403331c54f"} Jan 05 15:15:04 crc kubenswrapper[4740]: I0105 15:15:04.921539 4740 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a5741f1ff6cccebf7e4c7f00064eb43809d8a3fad14c8f48fce56403331c54f" Jan 05 15:15:04 crc kubenswrapper[4740]: I0105 15:15:04.921607 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29460435-jt862" Jan 05 15:15:05 crc kubenswrapper[4740]: I0105 15:15:05.575552 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4"] Jan 05 15:15:05 crc kubenswrapper[4740]: I0105 15:15:05.588199 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29460390-bbdh4"] Jan 05 15:15:06 crc kubenswrapper[4740]: I0105 15:15:06.981784 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a74ca1f5-6afe-4622-9306-f779d6f7dda3" path="/var/lib/kubelet/pods/a74ca1f5-6afe-4622-9306-f779d6f7dda3/volumes" Jan 05 15:15:27 crc kubenswrapper[4740]: I0105 15:15:27.208272 4740 generic.go:334] "Generic (PLEG): container finished" podID="22b2e838-f61a-45b7-b582-de4427c875a0" containerID="6ec97acc85ca9bd08d151c784d2f13739c092ceb742b59a4e982aa62a2b7ffa6" exitCode=0 Jan 05 15:15:27 crc kubenswrapper[4740]: I0105 15:15:27.208407 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fpnwq/crc-debug-xrqrf" event={"ID":"22b2e838-f61a-45b7-b582-de4427c875a0","Type":"ContainerDied","Data":"6ec97acc85ca9bd08d151c784d2f13739c092ceb742b59a4e982aa62a2b7ffa6"} Jan 05 15:15:28 crc kubenswrapper[4740]: I0105 15:15:28.343446 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fpnwq/crc-debug-xrqrf" Jan 05 15:15:28 crc kubenswrapper[4740]: I0105 15:15:28.380721 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22b2e838-f61a-45b7-b582-de4427c875a0-host\") pod \"22b2e838-f61a-45b7-b582-de4427c875a0\" (UID: \"22b2e838-f61a-45b7-b582-de4427c875a0\") " Jan 05 15:15:28 crc kubenswrapper[4740]: I0105 15:15:28.380846 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/22b2e838-f61a-45b7-b582-de4427c875a0-host" (OuterVolumeSpecName: "host") pod "22b2e838-f61a-45b7-b582-de4427c875a0" (UID: "22b2e838-f61a-45b7-b582-de4427c875a0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 15:15:28 crc kubenswrapper[4740]: I0105 15:15:28.381011 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djqhh\" (UniqueName: \"kubernetes.io/projected/22b2e838-f61a-45b7-b582-de4427c875a0-kube-api-access-djqhh\") pod \"22b2e838-f61a-45b7-b582-de4427c875a0\" (UID: \"22b2e838-f61a-45b7-b582-de4427c875a0\") " Jan 05 15:15:28 crc kubenswrapper[4740]: I0105 15:15:28.381815 4740 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22b2e838-f61a-45b7-b582-de4427c875a0-host\") on node \"crc\" DevicePath \"\"" Jan 05 15:15:28 crc kubenswrapper[4740]: I0105 15:15:28.390050 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fpnwq/crc-debug-xrqrf"] Jan 05 15:15:28 crc kubenswrapper[4740]: I0105 15:15:28.390347 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22b2e838-f61a-45b7-b582-de4427c875a0-kube-api-access-djqhh" (OuterVolumeSpecName: "kube-api-access-djqhh") pod "22b2e838-f61a-45b7-b582-de4427c875a0" (UID: "22b2e838-f61a-45b7-b582-de4427c875a0"). InnerVolumeSpecName "kube-api-access-djqhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 15:15:28 crc kubenswrapper[4740]: I0105 15:15:28.400616 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fpnwq/crc-debug-xrqrf"] Jan 05 15:15:28 crc kubenswrapper[4740]: I0105 15:15:28.483893 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djqhh\" (UniqueName: \"kubernetes.io/projected/22b2e838-f61a-45b7-b582-de4427c875a0-kube-api-access-djqhh\") on node \"crc\" DevicePath \"\"" Jan 05 15:15:28 crc kubenswrapper[4740]: I0105 15:15:28.981304 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22b2e838-f61a-45b7-b582-de4427c875a0" path="/var/lib/kubelet/pods/22b2e838-f61a-45b7-b582-de4427c875a0/volumes" Jan 05 15:15:29 crc kubenswrapper[4740]: I0105 15:15:29.232659 4740 scope.go:117] "RemoveContainer" containerID="6ec97acc85ca9bd08d151c784d2f13739c092ceb742b59a4e982aa62a2b7ffa6" Jan 05 15:15:29 crc kubenswrapper[4740]: I0105 15:15:29.232742 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fpnwq/crc-debug-xrqrf" Jan 05 15:15:29 crc kubenswrapper[4740]: I0105 15:15:29.606716 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fpnwq/crc-debug-m6vfc"] Jan 05 15:15:29 crc kubenswrapper[4740]: E0105 15:15:29.607371 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f00979a-073c-4681-baba-6c6eab07b907" containerName="collect-profiles" Jan 05 15:15:29 crc kubenswrapper[4740]: I0105 15:15:29.607389 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f00979a-073c-4681-baba-6c6eab07b907" containerName="collect-profiles" Jan 05 15:15:29 crc kubenswrapper[4740]: E0105 15:15:29.607418 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b2e838-f61a-45b7-b582-de4427c875a0" containerName="container-00" Jan 05 15:15:29 crc kubenswrapper[4740]: I0105 15:15:29.607426 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b2e838-f61a-45b7-b582-de4427c875a0" containerName="container-00" Jan 05 15:15:29 crc kubenswrapper[4740]: I0105 15:15:29.607768 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f00979a-073c-4681-baba-6c6eab07b907" containerName="collect-profiles" Jan 05 15:15:29 crc kubenswrapper[4740]: I0105 15:15:29.607789 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b2e838-f61a-45b7-b582-de4427c875a0" containerName="container-00" Jan 05 15:15:29 crc kubenswrapper[4740]: I0105 15:15:29.608751 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fpnwq/crc-debug-m6vfc" Jan 05 15:15:29 crc kubenswrapper[4740]: I0105 15:15:29.711446 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbdc0cea-738d-4d0b-9b49-74caf7562a7f-host\") pod \"crc-debug-m6vfc\" (UID: \"dbdc0cea-738d-4d0b-9b49-74caf7562a7f\") " pod="openshift-must-gather-fpnwq/crc-debug-m6vfc" Jan 05 15:15:29 crc kubenswrapper[4740]: I0105 15:15:29.711612 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drt2g\" (UniqueName: \"kubernetes.io/projected/dbdc0cea-738d-4d0b-9b49-74caf7562a7f-kube-api-access-drt2g\") pod \"crc-debug-m6vfc\" (UID: \"dbdc0cea-738d-4d0b-9b49-74caf7562a7f\") " pod="openshift-must-gather-fpnwq/crc-debug-m6vfc" Jan 05 15:15:29 crc kubenswrapper[4740]: I0105 15:15:29.814782 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drt2g\" (UniqueName: \"kubernetes.io/projected/dbdc0cea-738d-4d0b-9b49-74caf7562a7f-kube-api-access-drt2g\") pod \"crc-debug-m6vfc\" (UID: \"dbdc0cea-738d-4d0b-9b49-74caf7562a7f\") " pod="openshift-must-gather-fpnwq/crc-debug-m6vfc" Jan 05 15:15:29 crc kubenswrapper[4740]: I0105 15:15:29.815593 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbdc0cea-738d-4d0b-9b49-74caf7562a7f-host\") pod \"crc-debug-m6vfc\" (UID: \"dbdc0cea-738d-4d0b-9b49-74caf7562a7f\") " pod="openshift-must-gather-fpnwq/crc-debug-m6vfc" Jan 05 15:15:29 crc kubenswrapper[4740]: I0105 15:15:29.815915 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbdc0cea-738d-4d0b-9b49-74caf7562a7f-host\") pod \"crc-debug-m6vfc\" (UID: \"dbdc0cea-738d-4d0b-9b49-74caf7562a7f\") " pod="openshift-must-gather-fpnwq/crc-debug-m6vfc" Jan 05 15:15:29 crc 
kubenswrapper[4740]: I0105 15:15:29.833330 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drt2g\" (UniqueName: \"kubernetes.io/projected/dbdc0cea-738d-4d0b-9b49-74caf7562a7f-kube-api-access-drt2g\") pod \"crc-debug-m6vfc\" (UID: \"dbdc0cea-738d-4d0b-9b49-74caf7562a7f\") " pod="openshift-must-gather-fpnwq/crc-debug-m6vfc" Jan 05 15:15:29 crc kubenswrapper[4740]: I0105 15:15:29.931256 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fpnwq/crc-debug-m6vfc" Jan 05 15:15:30 crc kubenswrapper[4740]: I0105 15:15:30.255640 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fpnwq/crc-debug-m6vfc" event={"ID":"dbdc0cea-738d-4d0b-9b49-74caf7562a7f","Type":"ContainerStarted","Data":"e503bd58048d6737b19ec4c62c54798a5566c1daf6ffd0c7fa337f6d290d18cd"} Jan 05 15:15:31 crc kubenswrapper[4740]: I0105 15:15:31.270549 4740 generic.go:334] "Generic (PLEG): container finished" podID="dbdc0cea-738d-4d0b-9b49-74caf7562a7f" containerID="510ae65adee3521404c8170b8615dcfa24c8e3e5ec73449334aa43478cdd3624" exitCode=0 Jan 05 15:15:31 crc kubenswrapper[4740]: I0105 15:15:31.270649 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fpnwq/crc-debug-m6vfc" event={"ID":"dbdc0cea-738d-4d0b-9b49-74caf7562a7f","Type":"ContainerDied","Data":"510ae65adee3521404c8170b8615dcfa24c8e3e5ec73449334aa43478cdd3624"} Jan 05 15:15:32 crc kubenswrapper[4740]: I0105 15:15:32.397654 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fpnwq/crc-debug-m6vfc" Jan 05 15:15:32 crc kubenswrapper[4740]: I0105 15:15:32.469825 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drt2g\" (UniqueName: \"kubernetes.io/projected/dbdc0cea-738d-4d0b-9b49-74caf7562a7f-kube-api-access-drt2g\") pod \"dbdc0cea-738d-4d0b-9b49-74caf7562a7f\" (UID: \"dbdc0cea-738d-4d0b-9b49-74caf7562a7f\") " Jan 05 15:15:32 crc kubenswrapper[4740]: I0105 15:15:32.470387 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbdc0cea-738d-4d0b-9b49-74caf7562a7f-host\") pod \"dbdc0cea-738d-4d0b-9b49-74caf7562a7f\" (UID: \"dbdc0cea-738d-4d0b-9b49-74caf7562a7f\") " Jan 05 15:15:32 crc kubenswrapper[4740]: I0105 15:15:32.471238 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbdc0cea-738d-4d0b-9b49-74caf7562a7f-host" (OuterVolumeSpecName: "host") pod "dbdc0cea-738d-4d0b-9b49-74caf7562a7f" (UID: "dbdc0cea-738d-4d0b-9b49-74caf7562a7f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 15:15:32 crc kubenswrapper[4740]: I0105 15:15:32.476153 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbdc0cea-738d-4d0b-9b49-74caf7562a7f-kube-api-access-drt2g" (OuterVolumeSpecName: "kube-api-access-drt2g") pod "dbdc0cea-738d-4d0b-9b49-74caf7562a7f" (UID: "dbdc0cea-738d-4d0b-9b49-74caf7562a7f"). InnerVolumeSpecName "kube-api-access-drt2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 15:15:32 crc kubenswrapper[4740]: I0105 15:15:32.572660 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drt2g\" (UniqueName: \"kubernetes.io/projected/dbdc0cea-738d-4d0b-9b49-74caf7562a7f-kube-api-access-drt2g\") on node \"crc\" DevicePath \"\"" Jan 05 15:15:32 crc kubenswrapper[4740]: I0105 15:15:32.572692 4740 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbdc0cea-738d-4d0b-9b49-74caf7562a7f-host\") on node \"crc\" DevicePath \"\"" Jan 05 15:15:32 crc kubenswrapper[4740]: I0105 15:15:32.821467 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fpnwq/crc-debug-m6vfc"] Jan 05 15:15:32 crc kubenswrapper[4740]: I0105 15:15:32.833938 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fpnwq/crc-debug-m6vfc"] Jan 05 15:15:32 crc kubenswrapper[4740]: I0105 15:15:32.986725 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbdc0cea-738d-4d0b-9b49-74caf7562a7f" path="/var/lib/kubelet/pods/dbdc0cea-738d-4d0b-9b49-74caf7562a7f/volumes" Jan 05 15:15:33 crc kubenswrapper[4740]: I0105 15:15:33.320695 4740 scope.go:117] "RemoveContainer" containerID="510ae65adee3521404c8170b8615dcfa24c8e3e5ec73449334aa43478cdd3624" Jan 05 15:15:33 crc kubenswrapper[4740]: I0105 15:15:33.320964 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fpnwq/crc-debug-m6vfc" Jan 05 15:15:33 crc kubenswrapper[4740]: I0105 15:15:33.698592 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gdmb4"] Jan 05 15:15:33 crc kubenswrapper[4740]: E0105 15:15:33.699397 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbdc0cea-738d-4d0b-9b49-74caf7562a7f" containerName="container-00" Jan 05 15:15:33 crc kubenswrapper[4740]: I0105 15:15:33.699411 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbdc0cea-738d-4d0b-9b49-74caf7562a7f" containerName="container-00" Jan 05 15:15:33 crc kubenswrapper[4740]: I0105 15:15:33.699644 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbdc0cea-738d-4d0b-9b49-74caf7562a7f" containerName="container-00" Jan 05 15:15:33 crc kubenswrapper[4740]: I0105 15:15:33.701668 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gdmb4" Jan 05 15:15:33 crc kubenswrapper[4740]: I0105 15:15:33.721155 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gdmb4"] Jan 05 15:15:33 crc kubenswrapper[4740]: I0105 15:15:33.806973 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5l7c\" (UniqueName: \"kubernetes.io/projected/97dcf77b-7942-452b-81f6-ed513d674b38-kube-api-access-d5l7c\") pod \"certified-operators-gdmb4\" (UID: \"97dcf77b-7942-452b-81f6-ed513d674b38\") " pod="openshift-marketplace/certified-operators-gdmb4" Jan 05 15:15:33 crc kubenswrapper[4740]: I0105 15:15:33.807019 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97dcf77b-7942-452b-81f6-ed513d674b38-utilities\") pod \"certified-operators-gdmb4\" (UID: \"97dcf77b-7942-452b-81f6-ed513d674b38\") " pod="openshift-marketplace/certified-operators-gdmb4" Jan 05 15:15:33 crc kubenswrapper[4740]: I0105 15:15:33.807042 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97dcf77b-7942-452b-81f6-ed513d674b38-catalog-content\") pod \"certified-operators-gdmb4\" (UID: \"97dcf77b-7942-452b-81f6-ed513d674b38\") " pod="openshift-marketplace/certified-operators-gdmb4" Jan 05 15:15:33 crc kubenswrapper[4740]: I0105 15:15:33.909311 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5l7c\" (UniqueName: \"kubernetes.io/projected/97dcf77b-7942-452b-81f6-ed513d674b38-kube-api-access-d5l7c\") pod \"certified-operators-gdmb4\" (UID: \"97dcf77b-7942-452b-81f6-ed513d674b38\") " pod="openshift-marketplace/certified-operators-gdmb4" Jan 05 15:15:33 crc kubenswrapper[4740]: I0105 15:15:33.909365 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97dcf77b-7942-452b-81f6-ed513d674b38-utilities\") pod \"certified-operators-gdmb4\" (UID: \"97dcf77b-7942-452b-81f6-ed513d674b38\") " pod="openshift-marketplace/certified-operators-gdmb4" Jan 05 15:15:33 crc kubenswrapper[4740]: I0105 15:15:33.909404 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97dcf77b-7942-452b-81f6-ed513d674b38-catalog-content\") pod \"certified-operators-gdmb4\" (UID: \"97dcf77b-7942-452b-81f6-ed513d674b38\") " pod="openshift-marketplace/certified-operators-gdmb4" Jan 05 15:15:33 crc kubenswrapper[4740]: I0105 15:15:33.910165 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97dcf77b-7942-452b-81f6-ed513d674b38-catalog-content\") pod \"certified-operators-gdmb4\" (UID: \"97dcf77b-7942-452b-81f6-ed513d674b38\") " pod="openshift-marketplace/certified-operators-gdmb4" Jan 05 15:15:33 crc kubenswrapper[4740]: I0105 15:15:33.910743 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97dcf77b-7942-452b-81f6-ed513d674b38-utilities\") pod \"certified-operators-gdmb4\" (UID: \"97dcf77b-7942-452b-81f6-ed513d674b38\") " pod="openshift-marketplace/certified-operators-gdmb4" Jan 05 15:15:33 crc kubenswrapper[4740]: I0105 15:15:33.939082 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-d5l7c\" (UniqueName: \"kubernetes.io/projected/97dcf77b-7942-452b-81f6-ed513d674b38-kube-api-access-d5l7c\") pod \"certified-operators-gdmb4\" (UID: \"97dcf77b-7942-452b-81f6-ed513d674b38\") " pod="openshift-marketplace/certified-operators-gdmb4" Jan 05 15:15:34 crc kubenswrapper[4740]: I0105 15:15:34.021597 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gdmb4" Jan 05 15:15:34 crc kubenswrapper[4740]: I0105 15:15:34.032854 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fpnwq/crc-debug-88h4g"] Jan 05 15:15:34 crc kubenswrapper[4740]: I0105 15:15:34.034375 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fpnwq/crc-debug-88h4g" Jan 05 15:15:34 crc kubenswrapper[4740]: I0105 15:15:34.115087 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krlbh\" (UniqueName: \"kubernetes.io/projected/81385175-565f-4b61-88f8-78b38467cfef-kube-api-access-krlbh\") pod \"crc-debug-88h4g\" (UID: \"81385175-565f-4b61-88f8-78b38467cfef\") " pod="openshift-must-gather-fpnwq/crc-debug-88h4g" Jan 05 15:15:34 crc kubenswrapper[4740]: I0105 15:15:34.115228 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81385175-565f-4b61-88f8-78b38467cfef-host\") pod \"crc-debug-88h4g\" (UID: \"81385175-565f-4b61-88f8-78b38467cfef\") " pod="openshift-must-gather-fpnwq/crc-debug-88h4g" Jan 05 15:15:34 crc kubenswrapper[4740]: I0105 15:15:34.218535 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81385175-565f-4b61-88f8-78b38467cfef-host\") pod \"crc-debug-88h4g\" (UID: \"81385175-565f-4b61-88f8-78b38467cfef\") " pod="openshift-must-gather-fpnwq/crc-debug-88h4g" Jan 05 15:15:34 crc kubenswrapper[4740]: I0105 15:15:34.219028 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krlbh\" (UniqueName: \"kubernetes.io/projected/81385175-565f-4b61-88f8-78b38467cfef-kube-api-access-krlbh\") pod \"crc-debug-88h4g\" (UID: \"81385175-565f-4b61-88f8-78b38467cfef\") " pod="openshift-must-gather-fpnwq/crc-debug-88h4g" Jan 05 15:15:34 crc kubenswrapper[4740]: I0105 15:15:34.219475 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81385175-565f-4b61-88f8-78b38467cfef-host\") pod \"crc-debug-88h4g\" (UID: \"81385175-565f-4b61-88f8-78b38467cfef\") " pod="openshift-must-gather-fpnwq/crc-debug-88h4g" Jan 05 15:15:34 crc kubenswrapper[4740]: I0105 15:15:34.236669 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krlbh\" (UniqueName: \"kubernetes.io/projected/81385175-565f-4b61-88f8-78b38467cfef-kube-api-access-krlbh\") pod \"crc-debug-88h4g\" (UID: \"81385175-565f-4b61-88f8-78b38467cfef\") " pod="openshift-must-gather-fpnwq/crc-debug-88h4g" Jan 05 15:15:34 crc kubenswrapper[4740]: I0105 15:15:34.457383 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fpnwq/crc-debug-88h4g" Jan 05 15:15:34 crc kubenswrapper[4740]: W0105 15:15:34.496048 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81385175_565f_4b61_88f8_78b38467cfef.slice/crio-fec23e1132bef3e749a21d2c33cd1b95963a653515f75c0241065dceb7d76d7d WatchSource:0}: Error finding container fec23e1132bef3e749a21d2c33cd1b95963a653515f75c0241065dceb7d76d7d: Status 404 returned error can't find the container with id fec23e1132bef3e749a21d2c33cd1b95963a653515f75c0241065dceb7d76d7d Jan 05 15:15:34 crc kubenswrapper[4740]: I0105 15:15:34.593247 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gdmb4"] Jan 05 15:15:35 crc kubenswrapper[4740]: I0105 15:15:35.151163 4740 scope.go:117] "RemoveContainer" containerID="e2357239d27bc636bc43697a42f98c8365edc9e8fb1eb001919d63a3fc4372e0" Jan 05 15:15:35 crc kubenswrapper[4740]: I0105 15:15:35.346371 4740 generic.go:334] "Generic (PLEG): container finished" podID="81385175-565f-4b61-88f8-78b38467cfef" containerID="538223d3a4d6d53964d7105f64ed2be1e308346d8efb6e3bed6c0035cb76ff91" exitCode=0 Jan 05 15:15:35 crc kubenswrapper[4740]: I0105 15:15:35.346459 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fpnwq/crc-debug-88h4g" event={"ID":"81385175-565f-4b61-88f8-78b38467cfef","Type":"ContainerDied","Data":"538223d3a4d6d53964d7105f64ed2be1e308346d8efb6e3bed6c0035cb76ff91"} Jan 05 15:15:35 crc kubenswrapper[4740]: I0105 15:15:35.346520 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fpnwq/crc-debug-88h4g" event={"ID":"81385175-565f-4b61-88f8-78b38467cfef","Type":"ContainerStarted","Data":"fec23e1132bef3e749a21d2c33cd1b95963a653515f75c0241065dceb7d76d7d"} Jan 05 15:15:35 crc kubenswrapper[4740]: I0105 15:15:35.348492 4740 generic.go:334] "Generic (PLEG): container finished" podID="97dcf77b-7942-452b-81f6-ed513d674b38" containerID="74c41fe8cd193cbd2d7a390cda0c0de461115dba3d4e4de88d9158d9cd7cb04e" exitCode=0 Jan 05 15:15:35 crc kubenswrapper[4740]: I0105 15:15:35.348539 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdmb4" event={"ID":"97dcf77b-7942-452b-81f6-ed513d674b38","Type":"ContainerDied","Data":"74c41fe8cd193cbd2d7a390cda0c0de461115dba3d4e4de88d9158d9cd7cb04e"} Jan 05 15:15:35 crc kubenswrapper[4740]: I0105 15:15:35.348565 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdmb4" event={"ID":"97dcf77b-7942-452b-81f6-ed513d674b38","Type":"ContainerStarted","Data":"f556132be58ca24fb467266e8a1997f9a9ec0e53a02dc024b6ace86ab81b3b3f"} Jan 05 15:15:35 crc kubenswrapper[4740]: I0105 15:15:35.383627 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fpnwq/crc-debug-88h4g"] Jan 05 15:15:35 crc kubenswrapper[4740]: I0105 15:15:35.398034 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fpnwq/crc-debug-88h4g"] Jan 05 15:15:36 crc kubenswrapper[4740]: I0105 15:15:36.376312 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdmb4" event={"ID":"97dcf77b-7942-452b-81f6-ed513d674b38","Type":"ContainerStarted","Data":"6f9626a168ab94fd47194af65fb80fd8fff159f4d5e4c189c8822dee31581922"} Jan 05 15:15:36 crc kubenswrapper[4740]: I0105 15:15:36.516449 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fpnwq/crc-debug-88h4g" Jan 05 15:15:36 crc kubenswrapper[4740]: I0105 15:15:36.582593 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krlbh\" (UniqueName: \"kubernetes.io/projected/81385175-565f-4b61-88f8-78b38467cfef-kube-api-access-krlbh\") pod \"81385175-565f-4b61-88f8-78b38467cfef\" (UID: \"81385175-565f-4b61-88f8-78b38467cfef\") " Jan 05 15:15:36 crc kubenswrapper[4740]: I0105 15:15:36.582734 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81385175-565f-4b61-88f8-78b38467cfef-host\") pod \"81385175-565f-4b61-88f8-78b38467cfef\" (UID: \"81385175-565f-4b61-88f8-78b38467cfef\") " Jan 05 15:15:36 crc kubenswrapper[4740]: I0105 15:15:36.582826 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81385175-565f-4b61-88f8-78b38467cfef-host" (OuterVolumeSpecName: "host") pod "81385175-565f-4b61-88f8-78b38467cfef" (UID: "81385175-565f-4b61-88f8-78b38467cfef"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 05 15:15:36 crc kubenswrapper[4740]: I0105 15:15:36.583591 4740 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81385175-565f-4b61-88f8-78b38467cfef-host\") on node \"crc\" DevicePath \"\"" Jan 05 15:15:36 crc kubenswrapper[4740]: I0105 15:15:36.587874 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81385175-565f-4b61-88f8-78b38467cfef-kube-api-access-krlbh" (OuterVolumeSpecName: "kube-api-access-krlbh") pod "81385175-565f-4b61-88f8-78b38467cfef" (UID: "81385175-565f-4b61-88f8-78b38467cfef"). InnerVolumeSpecName "kube-api-access-krlbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 15:15:36 crc kubenswrapper[4740]: I0105 15:15:36.686282 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krlbh\" (UniqueName: \"kubernetes.io/projected/81385175-565f-4b61-88f8-78b38467cfef-kube-api-access-krlbh\") on node \"crc\" DevicePath \"\"" Jan 05 15:15:36 crc kubenswrapper[4740]: I0105 15:15:36.981926 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81385175-565f-4b61-88f8-78b38467cfef" path="/var/lib/kubelet/pods/81385175-565f-4b61-88f8-78b38467cfef/volumes" Jan 05 15:15:37 crc kubenswrapper[4740]: I0105 15:15:37.386607 4740 scope.go:117] "RemoveContainer" containerID="538223d3a4d6d53964d7105f64ed2be1e308346d8efb6e3bed6c0035cb76ff91" Jan 05 15:15:37 crc kubenswrapper[4740]: I0105 15:15:37.386623 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fpnwq/crc-debug-88h4g" Jan 05 15:15:38 crc kubenswrapper[4740]: I0105 15:15:38.403176 4740 generic.go:334] "Generic (PLEG): container finished" podID="97dcf77b-7942-452b-81f6-ed513d674b38" containerID="6f9626a168ab94fd47194af65fb80fd8fff159f4d5e4c189c8822dee31581922" exitCode=0 Jan 05 15:15:38 crc kubenswrapper[4740]: I0105 15:15:38.403270 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdmb4" event={"ID":"97dcf77b-7942-452b-81f6-ed513d674b38","Type":"ContainerDied","Data":"6f9626a168ab94fd47194af65fb80fd8fff159f4d5e4c189c8822dee31581922"} Jan 05 15:15:39 crc kubenswrapper[4740]: I0105 15:15:39.416943 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdmb4" event={"ID":"97dcf77b-7942-452b-81f6-ed513d674b38","Type":"ContainerStarted","Data":"225ca236fa1e28d49635a12e66eb5aeb512869416382e537b37029254bd1acf2"} Jan 05 15:15:39 crc kubenswrapper[4740]: I0105 15:15:39.442205 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gdmb4" podStartSLOduration=2.914777023 podStartE2EDuration="6.442185627s" podCreationTimestamp="2026-01-05 15:15:33 +0000 UTC" firstStartedPulling="2026-01-05 15:15:35.350390514 +0000 UTC m=+5184.657299113" lastFinishedPulling="2026-01-05 15:15:38.877799138 +0000 UTC m=+5188.184707717" observedRunningTime="2026-01-05 15:15:39.442027963 +0000 UTC m=+5188.748936542" watchObservedRunningTime="2026-01-05 15:15:39.442185627 +0000 UTC m=+5188.749094206" Jan 05 15:15:44 crc kubenswrapper[4740]: I0105 15:15:44.022790 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gdmb4" Jan 05 15:15:44 crc kubenswrapper[4740]: I0105 15:15:44.023258 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gdmb4" Jan 05 15:15:44 crc kubenswrapper[4740]: I0105 15:15:44.080798 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gdmb4" Jan 05 15:15:44 crc kubenswrapper[4740]: I0105 15:15:44.562282 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gdmb4" Jan 05 15:15:44 crc kubenswrapper[4740]: I0105 15:15:44.637586 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gdmb4"] Jan 05 15:15:46 crc kubenswrapper[4740]: I0105 15:15:46.525034 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gdmb4" podUID="97dcf77b-7942-452b-81f6-ed513d674b38" containerName="registry-server" containerID="cri-o://225ca236fa1e28d49635a12e66eb5aeb512869416382e537b37029254bd1acf2" gracePeriod=2 Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.195691 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gdmb4" Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.259380 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97dcf77b-7942-452b-81f6-ed513d674b38-utilities\") pod \"97dcf77b-7942-452b-81f6-ed513d674b38\" (UID: \"97dcf77b-7942-452b-81f6-ed513d674b38\") " Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.259462 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97dcf77b-7942-452b-81f6-ed513d674b38-catalog-content\") pod \"97dcf77b-7942-452b-81f6-ed513d674b38\" (UID: \"97dcf77b-7942-452b-81f6-ed513d674b38\") " Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.259499 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5l7c\" (UniqueName: \"kubernetes.io/projected/97dcf77b-7942-452b-81f6-ed513d674b38-kube-api-access-d5l7c\") pod \"97dcf77b-7942-452b-81f6-ed513d674b38\" (UID: \"97dcf77b-7942-452b-81f6-ed513d674b38\") " Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.261524 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97dcf77b-7942-452b-81f6-ed513d674b38-utilities" (OuterVolumeSpecName: "utilities") pod "97dcf77b-7942-452b-81f6-ed513d674b38" (UID: "97dcf77b-7942-452b-81f6-ed513d674b38"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.269479 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97dcf77b-7942-452b-81f6-ed513d674b38-kube-api-access-d5l7c" (OuterVolumeSpecName: "kube-api-access-d5l7c") pod "97dcf77b-7942-452b-81f6-ed513d674b38" (UID: "97dcf77b-7942-452b-81f6-ed513d674b38"). InnerVolumeSpecName "kube-api-access-d5l7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.323532 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97dcf77b-7942-452b-81f6-ed513d674b38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97dcf77b-7942-452b-81f6-ed513d674b38" (UID: "97dcf77b-7942-452b-81f6-ed513d674b38"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.361223 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97dcf77b-7942-452b-81f6-ed513d674b38-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.361254 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97dcf77b-7942-452b-81f6-ed513d674b38-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.361270 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5l7c\" (UniqueName: \"kubernetes.io/projected/97dcf77b-7942-452b-81f6-ed513d674b38-kube-api-access-d5l7c\") on node \"crc\" DevicePath \"\"" Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.540349 4740 generic.go:334] "Generic (PLEG): container finished" podID="97dcf77b-7942-452b-81f6-ed513d674b38" containerID="225ca236fa1e28d49635a12e66eb5aeb512869416382e537b37029254bd1acf2" exitCode=0 Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.540695 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gdmb4" Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.541177 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdmb4" event={"ID":"97dcf77b-7942-452b-81f6-ed513d674b38","Type":"ContainerDied","Data":"225ca236fa1e28d49635a12e66eb5aeb512869416382e537b37029254bd1acf2"} Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.541228 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdmb4" event={"ID":"97dcf77b-7942-452b-81f6-ed513d674b38","Type":"ContainerDied","Data":"f556132be58ca24fb467266e8a1997f9a9ec0e53a02dc024b6ace86ab81b3b3f"} Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.541247 4740 scope.go:117] "RemoveContainer" containerID="225ca236fa1e28d49635a12e66eb5aeb512869416382e537b37029254bd1acf2" Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.574136 4740 scope.go:117] "RemoveContainer" containerID="6f9626a168ab94fd47194af65fb80fd8fff159f4d5e4c189c8822dee31581922" Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.579074 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gdmb4"] Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.590969 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gdmb4"] Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.605760 4740 scope.go:117] "RemoveContainer" containerID="74c41fe8cd193cbd2d7a390cda0c0de461115dba3d4e4de88d9158d9cd7cb04e" Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.660722 4740 scope.go:117] "RemoveContainer" containerID="225ca236fa1e28d49635a12e66eb5aeb512869416382e537b37029254bd1acf2" Jan 05 15:15:47 crc kubenswrapper[4740]: E0105 15:15:47.661427 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"225ca236fa1e28d49635a12e66eb5aeb512869416382e537b37029254bd1acf2\": container with ID starting with 225ca236fa1e28d49635a12e66eb5aeb512869416382e537b37029254bd1acf2 not found: ID does not exist" containerID="225ca236fa1e28d49635a12e66eb5aeb512869416382e537b37029254bd1acf2" Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.661458 
4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"225ca236fa1e28d49635a12e66eb5aeb512869416382e537b37029254bd1acf2"} err="failed to get container status \"225ca236fa1e28d49635a12e66eb5aeb512869416382e537b37029254bd1acf2\": rpc error: code = NotFound desc = could not find container \"225ca236fa1e28d49635a12e66eb5aeb512869416382e537b37029254bd1acf2\": container with ID starting with 225ca236fa1e28d49635a12e66eb5aeb512869416382e537b37029254bd1acf2 not found: ID does not exist" Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.661479 4740 scope.go:117] "RemoveContainer" containerID="6f9626a168ab94fd47194af65fb80fd8fff159f4d5e4c189c8822dee31581922" Jan 05 15:15:47 crc kubenswrapper[4740]: E0105 15:15:47.662405 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f9626a168ab94fd47194af65fb80fd8fff159f4d5e4c189c8822dee31581922\": container with ID starting with 6f9626a168ab94fd47194af65fb80fd8fff159f4d5e4c189c8822dee31581922 not found: ID does not exist" containerID="6f9626a168ab94fd47194af65fb80fd8fff159f4d5e4c189c8822dee31581922" Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.662433 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f9626a168ab94fd47194af65fb80fd8fff159f4d5e4c189c8822dee31581922"} err="failed to get container status \"6f9626a168ab94fd47194af65fb80fd8fff159f4d5e4c189c8822dee31581922\": rpc error: code = NotFound desc = could not find container \"6f9626a168ab94fd47194af65fb80fd8fff159f4d5e4c189c8822dee31581922\": container with ID starting with 6f9626a168ab94fd47194af65fb80fd8fff159f4d5e4c189c8822dee31581922 not found: ID does not exist" Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.662449 4740 scope.go:117] "RemoveContainer" containerID="74c41fe8cd193cbd2d7a390cda0c0de461115dba3d4e4de88d9158d9cd7cb04e" Jan 05 15:15:47 crc kubenswrapper[4740]: E0105 15:15:47.667025 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c41fe8cd193cbd2d7a390cda0c0de461115dba3d4e4de88d9158d9cd7cb04e\": container with ID starting with 74c41fe8cd193cbd2d7a390cda0c0de461115dba3d4e4de88d9158d9cd7cb04e not found: ID does not exist" containerID="74c41fe8cd193cbd2d7a390cda0c0de461115dba3d4e4de88d9158d9cd7cb04e" Jan 05 15:15:47 crc kubenswrapper[4740]: I0105 15:15:47.667052 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c41fe8cd193cbd2d7a390cda0c0de461115dba3d4e4de88d9158d9cd7cb04e"} err="failed to get container status \"74c41fe8cd193cbd2d7a390cda0c0de461115dba3d4e4de88d9158d9cd7cb04e\": rpc error: code = NotFound desc = could not find container \"74c41fe8cd193cbd2d7a390cda0c0de461115dba3d4e4de88d9158d9cd7cb04e\": container with ID starting with 74c41fe8cd193cbd2d7a390cda0c0de461115dba3d4e4de88d9158d9cd7cb04e not found: ID does not exist" Jan 05 15:15:48 crc kubenswrapper[4740]: I0105 15:15:48.983463 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97dcf77b-7942-452b-81f6-ed513d674b38" path="/var/lib/kubelet/pods/97dcf77b-7942-452b-81f6-ed513d674b38/volumes" Jan 05 15:16:04 crc kubenswrapper[4740]: I0105 15:16:04.707823 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_0622aaac-01e2-4f75-8de9-49db879c1703/aodh-api/0.log" Jan 05 15:16:04 crc kubenswrapper[4740]: I0105 15:16:04.969118 4740 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_aodh-0_0622aaac-01e2-4f75-8de9-49db879c1703/aodh-evaluator/0.log" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.028628 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_0622aaac-01e2-4f75-8de9-49db879c1703/aodh-listener/0.log" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.064846 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_0622aaac-01e2-4f75-8de9-49db879c1703/aodh-notifier/0.log" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.160307 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5dc98bb458-s9tqv_1f5ea36a-57ff-4d4e-ac3d-914d2278cd96/barbican-api/0.log" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.242315 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5dc98bb458-s9tqv_1f5ea36a-57ff-4d4e-ac3d-914d2278cd96/barbican-api-log/0.log" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.382043 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c988b84c4-q8wdn_98684691-8ef2-4cb5-85e2-fe5913a5b3c0/barbican-keystone-listener/0.log" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.580992 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c988b84c4-q8wdn_98684691-8ef2-4cb5-85e2-fe5913a5b3c0/barbican-keystone-listener-log/0.log" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.614019 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7f99fddc9f-j57sh_6e339bc8-cbcb-48ac-84bd-ad37fcd552c0/barbican-worker/0.log" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.688497 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7f99fddc9f-j57sh_6e339bc8-cbcb-48ac-84bd-ad37fcd552c0/barbican-worker-log/0.log" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.781821 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xzm4t"] Jan 05 15:16:05 crc kubenswrapper[4740]: E0105 15:16:05.782495 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97dcf77b-7942-452b-81f6-ed513d674b38" containerName="extract-content" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.782515 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="97dcf77b-7942-452b-81f6-ed513d674b38" containerName="extract-content" Jan 05 15:16:05 crc kubenswrapper[4740]: E0105 15:16:05.782570 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97dcf77b-7942-452b-81f6-ed513d674b38" containerName="registry-server" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.782579 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="97dcf77b-7942-452b-81f6-ed513d674b38" containerName="registry-server" Jan 05 15:16:05 crc kubenswrapper[4740]: E0105 15:16:05.782599 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81385175-565f-4b61-88f8-78b38467cfef" containerName="container-00" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.782607 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="81385175-565f-4b61-88f8-78b38467cfef" containerName="container-00" Jan 05 15:16:05 crc kubenswrapper[4740]: E0105 15:16:05.782637 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97dcf77b-7942-452b-81f6-ed513d674b38" containerName="extract-utilities" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.782646 
4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="97dcf77b-7942-452b-81f6-ed513d674b38" containerName="extract-utilities" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.782922 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="97dcf77b-7942-452b-81f6-ed513d674b38" containerName="registry-server" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.782977 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="81385175-565f-4b61-88f8-78b38467cfef" containerName="container-00" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.784795 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzm4t" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.824643 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzm4t"] Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.909446 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-997gx_50749d1a-ca02-4a0c-8f55-522f5d53497a/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.951621 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn7hl\" (UniqueName: \"kubernetes.io/projected/9a024072-a186-4b1c-8bba-1500ed0608a5-kube-api-access-cn7hl\") pod \"redhat-marketplace-xzm4t\" (UID: \"9a024072-a186-4b1c-8bba-1500ed0608a5\") " pod="openshift-marketplace/redhat-marketplace-xzm4t" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.951810 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a024072-a186-4b1c-8bba-1500ed0608a5-utilities\") pod \"redhat-marketplace-xzm4t\" (UID: \"9a024072-a186-4b1c-8bba-1500ed0608a5\") " pod="openshift-marketplace/redhat-marketplace-xzm4t" Jan 05 15:16:05 crc kubenswrapper[4740]: I0105 15:16:05.952136 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a024072-a186-4b1c-8bba-1500ed0608a5-catalog-content\") pod \"redhat-marketplace-xzm4t\" (UID: \"9a024072-a186-4b1c-8bba-1500ed0608a5\") " pod="openshift-marketplace/redhat-marketplace-xzm4t" Jan 05 15:16:06 crc kubenswrapper[4740]: I0105 15:16:06.024574 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b922e17-0ca9-49cd-8af7-b78776b990bb/ceilometer-central-agent/1.log" Jan 05 15:16:06 crc kubenswrapper[4740]: I0105 15:16:06.054485 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a024072-a186-4b1c-8bba-1500ed0608a5-catalog-content\") pod \"redhat-marketplace-xzm4t\" (UID: \"9a024072-a186-4b1c-8bba-1500ed0608a5\") " pod="openshift-marketplace/redhat-marketplace-xzm4t" Jan 05 15:16:06 crc kubenswrapper[4740]: I0105 15:16:06.054546 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn7hl\" (UniqueName: \"kubernetes.io/projected/9a024072-a186-4b1c-8bba-1500ed0608a5-kube-api-access-cn7hl\") pod \"redhat-marketplace-xzm4t\" (UID: \"9a024072-a186-4b1c-8bba-1500ed0608a5\") " pod="openshift-marketplace/redhat-marketplace-xzm4t" Jan 05 15:16:06 crc kubenswrapper[4740]: I0105 15:16:06.054656 4740 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a024072-a186-4b1c-8bba-1500ed0608a5-utilities\") pod \"redhat-marketplace-xzm4t\" (UID: \"9a024072-a186-4b1c-8bba-1500ed0608a5\") " pod="openshift-marketplace/redhat-marketplace-xzm4t" Jan 05 15:16:06 crc kubenswrapper[4740]: I0105 15:16:06.055098 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a024072-a186-4b1c-8bba-1500ed0608a5-catalog-content\") pod \"redhat-marketplace-xzm4t\" (UID: \"9a024072-a186-4b1c-8bba-1500ed0608a5\") " pod="openshift-marketplace/redhat-marketplace-xzm4t" Jan 05 15:16:06 crc kubenswrapper[4740]: I0105 15:16:06.055144 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a024072-a186-4b1c-8bba-1500ed0608a5-utilities\") pod \"redhat-marketplace-xzm4t\" (UID: \"9a024072-a186-4b1c-8bba-1500ed0608a5\") " pod="openshift-marketplace/redhat-marketplace-xzm4t" Jan 05 15:16:06 crc kubenswrapper[4740]: I0105 15:16:06.079467 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn7hl\" (UniqueName: \"kubernetes.io/projected/9a024072-a186-4b1c-8bba-1500ed0608a5-kube-api-access-cn7hl\") pod \"redhat-marketplace-xzm4t\" (UID: \"9a024072-a186-4b1c-8bba-1500ed0608a5\") " pod="openshift-marketplace/redhat-marketplace-xzm4t" Jan 05 15:16:06 crc kubenswrapper[4740]: I0105 15:16:06.112478 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzm4t" Jan 05 15:16:06 crc kubenswrapper[4740]: I0105 15:16:06.167550 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b922e17-0ca9-49cd-8af7-b78776b990bb/ceilometer-central-agent/0.log" Jan 05 15:16:06 crc kubenswrapper[4740]: I0105 15:16:06.204840 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b922e17-0ca9-49cd-8af7-b78776b990bb/ceilometer-notification-agent/0.log" Jan 05 15:16:06 crc kubenswrapper[4740]: I0105 15:16:06.303295 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b922e17-0ca9-49cd-8af7-b78776b990bb/sg-core/0.log" Jan 05 15:16:06 crc kubenswrapper[4740]: I0105 15:16:06.312684 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2b922e17-0ca9-49cd-8af7-b78776b990bb/proxy-httpd/0.log" Jan 05 15:16:06 crc kubenswrapper[4740]: I0105 15:16:06.588242 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b6206351-05ef-4835-9e76-0002c5eca516/cinder-api/0.log" Jan 05 15:16:06 crc kubenswrapper[4740]: I0105 15:16:06.596389 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b6206351-05ef-4835-9e76-0002c5eca516/cinder-api-log/0.log" Jan 05 15:16:06 crc kubenswrapper[4740]: I0105 15:16:06.700859 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzm4t"] Jan 05 15:16:06 crc kubenswrapper[4740]: I0105 15:16:06.862110 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzm4t" event={"ID":"9a024072-a186-4b1c-8bba-1500ed0608a5","Type":"ContainerStarted","Data":"e25d9401b5a24e70bfc88d09423424bf1a72814ee286ee98d5db0770d1986952"} Jan 05 15:16:07 crc kubenswrapper[4740]: I0105 15:16:07.014215 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_623b4799-3a7c-4cba-8632-420ef0704992/cinder-scheduler/1.log" Jan 05 15:16:07 crc kubenswrapper[4740]: I0105 15:16:07.246931 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_623b4799-3a7c-4cba-8632-420ef0704992/cinder-scheduler/0.log" Jan 05 15:16:07 crc kubenswrapper[4740]: I0105 15:16:07.289609 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_623b4799-3a7c-4cba-8632-420ef0704992/probe/0.log" Jan 05 15:16:07 crc kubenswrapper[4740]: I0105 15:16:07.361914 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-7d5kk_dc645406-ab7f-4676-a442-373f2251f8d1/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:07 crc kubenswrapper[4740]: I0105 15:16:07.887547 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-z745v_43c57365-103e-453b-9fd9-685adcc47850/init/0.log" Jan 05 15:16:07 crc kubenswrapper[4740]: I0105 15:16:07.890933 4740 generic.go:334] "Generic (PLEG): container finished" podID="9a024072-a186-4b1c-8bba-1500ed0608a5" containerID="e2a41f5d9f9482f7372b6ce7ba6442025d52b45f80956035630ebe53c95183db" exitCode=0 Jan 05 15:16:07 crc kubenswrapper[4740]: I0105 15:16:07.891141 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzm4t" event={"ID":"9a024072-a186-4b1c-8bba-1500ed0608a5","Type":"ContainerDied","Data":"e2a41f5d9f9482f7372b6ce7ba6442025d52b45f80956035630ebe53c95183db"} Jan 05 15:16:07 crc kubenswrapper[4740]: I0105 15:16:07.915304 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 05 15:16:07 crc kubenswrapper[4740]: I0105 15:16:07.960901 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-gbpsz_aa2b5d30-c6dd-4d84-b33f-b3e855e61b24/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:08 crc kubenswrapper[4740]: I0105 15:16:08.233577 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-z745v_43c57365-103e-453b-9fd9-685adcc47850/init/0.log" Jan 05 15:16:08 crc kubenswrapper[4740]: I0105 15:16:08.495276 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-f4kg2_611be36b-14e8-4639-a2a5-f1ed357cfc34/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:08 crc kubenswrapper[4740]: I0105 15:16:08.585728 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-z745v_43c57365-103e-453b-9fd9-685adcc47850/dnsmasq-dns/0.log" Jan 05 15:16:08 crc kubenswrapper[4740]: I0105 15:16:08.830051 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f4ba45a8-a488-4b50-8d75-13a2e63dbac8/glance-httpd/0.log" Jan 05 15:16:08 crc kubenswrapper[4740]: I0105 15:16:08.845740 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f4ba45a8-a488-4b50-8d75-13a2e63dbac8/glance-log/0.log" Jan 05 15:16:09 crc kubenswrapper[4740]: I0105 15:16:09.143860 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b4741b20-74ae-4c1d-b8ea-f9d0579f0b20/glance-httpd/0.log" Jan 05 15:16:09 crc kubenswrapper[4740]: I0105 15:16:09.192282 4740 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b4741b20-74ae-4c1d-b8ea-f9d0579f0b20/glance-log/0.log" Jan 05 15:16:09 crc kubenswrapper[4740]: I0105 15:16:09.680888 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-5996fc5cd9-vd2l5_5598f372-3948-4c7b-8f77-f8305802d248/heat-api/0.log" Jan 05 15:16:09 crc kubenswrapper[4740]: I0105 15:16:09.809021 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-6ff48d7c64-5zmkg_efa1a5de-d25c-4be0-b921-2cd12446d1dc/heat-engine/0.log" Jan 05 15:16:09 crc kubenswrapper[4740]: I0105 15:16:09.986871 4740 generic.go:334] "Generic (PLEG): container finished" podID="9a024072-a186-4b1c-8bba-1500ed0608a5" containerID="6cbc3e586cbd63df9b65e1402082f2c5126da4b793e483d47ffce09d275acddb" exitCode=0 Jan 05 15:16:09 crc kubenswrapper[4740]: I0105 15:16:09.987135 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzm4t" event={"ID":"9a024072-a186-4b1c-8bba-1500ed0608a5","Type":"ContainerDied","Data":"6cbc3e586cbd63df9b65e1402082f2c5126da4b793e483d47ffce09d275acddb"} Jan 05 15:16:10 crc kubenswrapper[4740]: I0105 15:16:10.034344 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-rdrh9_e2a521bb-96d4-48cf-bb3a-cb2d48b2edcb/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:10 crc kubenswrapper[4740]: I0105 15:16:10.067407 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-xtb2g_fd7ca708-94a1-4016-a666-8a9b1eb34a62/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:10 crc kubenswrapper[4740]: I0105 15:16:10.127011 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-5cd669fcc8-pkmkd_e19ab652-e210-4428-84aa-8f04d156a4fb/heat-cfnapi/0.log" Jan 05 15:16:10 crc kubenswrapper[4740]: I0105 15:16:10.538160 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6d1b6a83-0692-4de9-8d5f-56f4371b9d22/kube-state-metrics/0.log" Jan 05 15:16:10 crc kubenswrapper[4740]: I0105 15:16:10.609173 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29460421-bppnn_c1324171-d9ac-4ee9-8fae-36557d38ad3e/keystone-cron/0.log" Jan 05 15:16:10 crc kubenswrapper[4740]: I0105 15:16:10.722452 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5b7fccd64c-7x22h_9a37e8fa-bd7c-4449-84d9-45067f3abff7/keystone-api/0.log" Jan 05 15:16:10 crc kubenswrapper[4740]: I0105 15:16:10.822158 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-tpzz5_7fd8586e-a676-4220-a8b8-1b75b6d9a789/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:10 crc kubenswrapper[4740]: I0105 15:16:10.947491 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-k56qg_f6735580-cec3-4c58-82ed-37c1b38ba74c/logging-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:11 crc kubenswrapper[4740]: I0105 15:16:11.023239 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzm4t" event={"ID":"9a024072-a186-4b1c-8bba-1500ed0608a5","Type":"ContainerStarted","Data":"561e5fa0d29682dd9a5aa9cd0667205eddca22f8bcc9f3f655aac5c6f7e7059e"} Jan 05 15:16:11 crc kubenswrapper[4740]: I0105 15:16:11.055156 4740 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xzm4t" podStartSLOduration=3.475920915 podStartE2EDuration="6.055137445s" podCreationTimestamp="2026-01-05 15:16:05 +0000 UTC" firstStartedPulling="2026-01-05 15:16:07.910722844 +0000 UTC m=+5217.217631423" lastFinishedPulling="2026-01-05 15:16:10.489939374 +0000 UTC m=+5219.796847953" observedRunningTime="2026-01-05 15:16:11.044646914 +0000 UTC m=+5220.351555493" watchObservedRunningTime="2026-01-05 15:16:11.055137445 +0000 UTC m=+5220.362046024" Jan 05 15:16:11 crc kubenswrapper[4740]: I0105 15:16:11.161177 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_4b6027cc-1afc-468f-9a91-6c1f3f844ba2/mysqld-exporter/0.log" Jan 05 15:16:11 crc kubenswrapper[4740]: I0105 15:16:11.670320 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5797c5bd9-frslw_713e1044-70d2-45c5-a6e7-dc2fcc24ed54/neutron-api/0.log" Jan 05 15:16:11 crc kubenswrapper[4740]: I0105 15:16:11.697972 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5797c5bd9-frslw_713e1044-70d2-45c5-a6e7-dc2fcc24ed54/neutron-httpd/0.log" Jan 05 15:16:11 crc kubenswrapper[4740]: I0105 15:16:11.834183 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-vlf65_5d3a3a3d-0940-4e1f-91cd-2720ec3d7867/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:12 crc kubenswrapper[4740]: I0105 15:16:12.541167 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_03e6abd6-0882-4930-90c6-d2ba9ded2d49/nova-api-log/0.log" Jan 05 15:16:12 crc kubenswrapper[4740]: I0105 15:16:12.851758 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_20a12d4e-701a-4535-9755-79b170faffa6/nova-cell0-conductor-conductor/0.log" Jan 05 15:16:12 crc kubenswrapper[4740]: I0105 15:16:12.865405 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_03e6abd6-0882-4930-90c6-d2ba9ded2d49/nova-api-api/0.log" Jan 05 15:16:13 crc kubenswrapper[4740]: I0105 15:16:13.216362 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2ad0da6b-b746-4181-b51a-913721d967e3/nova-cell1-novncproxy-novncproxy/0.log" Jan 05 15:16:13 crc kubenswrapper[4740]: I0105 15:16:13.249085 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4a384d1a-1719-4e7c-b0fc-7a15242711f0/nova-cell1-conductor-conductor/0.log" Jan 05 15:16:13 crc kubenswrapper[4740]: I0105 15:16:13.473902 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-xg2sb_0a84d8e6-06d0-4645-b42f-a77963c58987/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:13 crc kubenswrapper[4740]: I0105 15:16:13.622307 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bd6f1475-5188-4831-91b3-e6741b308a2e/nova-metadata-log/0.log" Jan 05 15:16:14 crc kubenswrapper[4740]: I0105 15:16:14.046650 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_043aa2af-a684-451e-a812-cf42bd753490/nova-scheduler-scheduler/0.log" Jan 05 15:16:14 crc kubenswrapper[4740]: I0105 15:16:14.980659 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_261c319a-37da-4987-a774-ecc24fa6b083/mysql-bootstrap/0.log" Jan 05 15:16:15 crc kubenswrapper[4740]: I0105 15:16:15.131701 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_261c319a-37da-4987-a774-ecc24fa6b083/mysql-bootstrap/0.log" Jan 05 15:16:15 crc kubenswrapper[4740]: I0105 15:16:15.171187 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_261c319a-37da-4987-a774-ecc24fa6b083/galera/1.log" Jan 05 15:16:15 crc kubenswrapper[4740]: I0105 15:16:15.468305 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dbb5263b-e98b-48a4-825e-ffb99738059f/mysql-bootstrap/0.log" Jan 05 15:16:15 crc kubenswrapper[4740]: I0105 15:16:15.475736 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_261c319a-37da-4987-a774-ecc24fa6b083/galera/0.log" Jan 05 15:16:15 crc kubenswrapper[4740]: I0105 15:16:15.669582 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dbb5263b-e98b-48a4-825e-ffb99738059f/mysql-bootstrap/0.log" Jan 05 15:16:15 crc kubenswrapper[4740]: I0105 15:16:15.850003 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dbb5263b-e98b-48a4-825e-ffb99738059f/galera/1.log" Jan 05 15:16:15 crc kubenswrapper[4740]: I0105 15:16:15.887714 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_dbb5263b-e98b-48a4-825e-ffb99738059f/galera/0.log" Jan 05 15:16:16 crc kubenswrapper[4740]: I0105 15:16:16.018323 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bd6f1475-5188-4831-91b3-e6741b308a2e/nova-metadata-metadata/0.log" Jan 05 15:16:16 crc kubenswrapper[4740]: I0105 15:16:16.112619 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xzm4t" Jan 05 15:16:16 crc kubenswrapper[4740]: I0105 15:16:16.115246 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xzm4t" Jan 05 15:16:16 crc kubenswrapper[4740]: I0105 15:16:16.127215 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_6605f3c2-8a9f-45d4-9060-8e44e4bccac6/openstackclient/0.log" Jan 05 15:16:16 crc kubenswrapper[4740]: I0105 15:16:16.172140 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xzm4t" Jan 05 15:16:16 crc kubenswrapper[4740]: I0105 15:16:16.247525 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-hr9pm_da00ff66-0241-449f-9ceb-9c9849d5f646/ovn-controller/0.log" Jan 05 15:16:16 crc kubenswrapper[4740]: I0105 15:16:16.379208 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xbfj5_2ca95a0e-df45-403b-b278-5701175ac8e1/openstack-network-exporter/0.log" Jan 05 15:16:16 crc kubenswrapper[4740]: I0105 15:16:16.562699 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zgtw8_9a6073e2-c1cc-4ecb-982c-7e872257e83d/ovsdb-server-init/0.log" Jan 05 15:16:16 crc kubenswrapper[4740]: I0105 15:16:16.775818 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zgtw8_9a6073e2-c1cc-4ecb-982c-7e872257e83d/ovsdb-server-init/0.log" Jan 05 15:16:17 crc kubenswrapper[4740]: 
I0105 15:16:17.131501 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zgtw8_9a6073e2-c1cc-4ecb-982c-7e872257e83d/ovs-vswitchd/0.log" Jan 05 15:16:17 crc kubenswrapper[4740]: I0105 15:16:17.184308 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xzm4t" Jan 05 15:16:17 crc kubenswrapper[4740]: I0105 15:16:17.221037 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zgtw8_9a6073e2-c1cc-4ecb-982c-7e872257e83d/ovsdb-server/0.log" Jan 05 15:16:17 crc kubenswrapper[4740]: I0105 15:16:17.255917 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzm4t"] Jan 05 15:16:17 crc kubenswrapper[4740]: I0105 15:16:17.358979 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-dl27j_237ed6e7-19dd-4f03-8da9-6c43db4798c6/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:17 crc kubenswrapper[4740]: I0105 15:16:17.414549 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bf526b0a-998c-4943-bfba-04352421ed58/openstack-network-exporter/0.log" Jan 05 15:16:17 crc kubenswrapper[4740]: I0105 15:16:17.522439 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bf526b0a-998c-4943-bfba-04352421ed58/ovn-northd/1.log" Jan 05 15:16:17 crc kubenswrapper[4740]: I0105 15:16:17.604152 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bf526b0a-998c-4943-bfba-04352421ed58/ovn-northd/0.log" Jan 05 15:16:17 crc kubenswrapper[4740]: I0105 15:16:17.717493 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_81bea285-8670-4e86-a0a0-df327d8cf009/openstack-network-exporter/0.log" Jan 05 15:16:18 crc kubenswrapper[4740]: I0105 15:16:18.000437 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_81bea285-8670-4e86-a0a0-df327d8cf009/ovsdbserver-nb/0.log" Jan 05 15:16:18 crc kubenswrapper[4740]: I0105 15:16:18.009345 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b66d1fba-b858-4c18-9ddc-d329614afe92/ovsdbserver-sb/0.log" Jan 05 15:16:18 crc kubenswrapper[4740]: I0105 15:16:18.011749 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b66d1fba-b858-4c18-9ddc-d329614afe92/openstack-network-exporter/0.log" Jan 05 15:16:18 crc kubenswrapper[4740]: I0105 15:16:18.414922 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_a87008bf-295c-4343-a6b2-f3fd37fa581d/init-config-reloader/0.log" Jan 05 15:16:18 crc kubenswrapper[4740]: I0105 15:16:18.434514 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-85c7d9bc6-f85sn_802f617e-6b4c-4e6e-aeab-11fa157041b9/placement-api/0.log" Jan 05 15:16:18 crc kubenswrapper[4740]: I0105 15:16:18.463628 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-85c7d9bc6-f85sn_802f617e-6b4c-4e6e-aeab-11fa157041b9/placement-log/0.log" Jan 05 15:16:18 crc kubenswrapper[4740]: I0105 15:16:18.695213 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_a87008bf-295c-4343-a6b2-f3fd37fa581d/init-config-reloader/0.log" Jan 05 15:16:18 crc kubenswrapper[4740]: I0105 15:16:18.695294 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_prometheus-metric-storage-0_a87008bf-295c-4343-a6b2-f3fd37fa581d/prometheus/0.log" Jan 05 15:16:18 crc kubenswrapper[4740]: I0105 15:16:18.719296 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_a87008bf-295c-4343-a6b2-f3fd37fa581d/thanos-sidecar/0.log" Jan 05 15:16:18 crc kubenswrapper[4740]: I0105 15:16:18.751816 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_a87008bf-295c-4343-a6b2-f3fd37fa581d/config-reloader/0.log" Jan 05 15:16:18 crc kubenswrapper[4740]: I0105 15:16:18.918955 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8a70451d-dcc2-4bce-81b1-e1f6291eac3b/setup-container/0.log" Jan 05 15:16:19 crc kubenswrapper[4740]: I0105 15:16:19.124120 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xzm4t" podUID="9a024072-a186-4b1c-8bba-1500ed0608a5" containerName="registry-server" containerID="cri-o://561e5fa0d29682dd9a5aa9cd0667205eddca22f8bcc9f3f655aac5c6f7e7059e" gracePeriod=2 Jan 05 15:16:19 crc kubenswrapper[4740]: I0105 15:16:19.283364 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8a70451d-dcc2-4bce-81b1-e1f6291eac3b/setup-container/0.log" Jan 05 15:16:19 crc kubenswrapper[4740]: I0105 15:16:19.290534 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8a70451d-dcc2-4bce-81b1-e1f6291eac3b/rabbitmq/0.log" Jan 05 15:16:19 crc kubenswrapper[4740]: I0105 15:16:19.407966 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e/setup-container/0.log" Jan 05 15:16:19 crc kubenswrapper[4740]: I0105 15:16:19.719047 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e/setup-container/0.log" Jan 05 15:16:19 crc kubenswrapper[4740]: I0105 15:16:19.728486 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f0d6e0a8-dbcd-40b8-874e-6c5c1d76183e/rabbitmq/0.log" Jan 05 15:16:19 crc kubenswrapper[4740]: I0105 15:16:19.819604 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_93d7209e-3012-42db-a76c-cd020634e3c4/setup-container/0.log" Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.177661 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_8150ac8e-6303-4af1-8a21-6fb434df508b/setup-container/0.log" Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.184852 4740 generic.go:334] "Generic (PLEG): container finished" podID="9a024072-a186-4b1c-8bba-1500ed0608a5" containerID="561e5fa0d29682dd9a5aa9cd0667205eddca22f8bcc9f3f655aac5c6f7e7059e" exitCode=0 Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.184897 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzm4t" event={"ID":"9a024072-a186-4b1c-8bba-1500ed0608a5","Type":"ContainerDied","Data":"561e5fa0d29682dd9a5aa9cd0667205eddca22f8bcc9f3f655aac5c6f7e7059e"} Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.244805 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_93d7209e-3012-42db-a76c-cd020634e3c4/rabbitmq/0.log" Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.262544 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-1_93d7209e-3012-42db-a76c-cd020634e3c4/setup-container/0.log" Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.342843 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzm4t" Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.472295 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a024072-a186-4b1c-8bba-1500ed0608a5-utilities\") pod \"9a024072-a186-4b1c-8bba-1500ed0608a5\" (UID: \"9a024072-a186-4b1c-8bba-1500ed0608a5\") " Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.472375 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a024072-a186-4b1c-8bba-1500ed0608a5-catalog-content\") pod \"9a024072-a186-4b1c-8bba-1500ed0608a5\" (UID: \"9a024072-a186-4b1c-8bba-1500ed0608a5\") " Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.472484 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn7hl\" (UniqueName: \"kubernetes.io/projected/9a024072-a186-4b1c-8bba-1500ed0608a5-kube-api-access-cn7hl\") pod \"9a024072-a186-4b1c-8bba-1500ed0608a5\" (UID: \"9a024072-a186-4b1c-8bba-1500ed0608a5\") " Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.473832 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a024072-a186-4b1c-8bba-1500ed0608a5-utilities" (OuterVolumeSpecName: "utilities") pod "9a024072-a186-4b1c-8bba-1500ed0608a5" (UID: "9a024072-a186-4b1c-8bba-1500ed0608a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.474471 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a024072-a186-4b1c-8bba-1500ed0608a5-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.484239 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a024072-a186-4b1c-8bba-1500ed0608a5-kube-api-access-cn7hl" (OuterVolumeSpecName: "kube-api-access-cn7hl") pod "9a024072-a186-4b1c-8bba-1500ed0608a5" (UID: "9a024072-a186-4b1c-8bba-1500ed0608a5"). InnerVolumeSpecName "kube-api-access-cn7hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.515904 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a024072-a186-4b1c-8bba-1500ed0608a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a024072-a186-4b1c-8bba-1500ed0608a5" (UID: "9a024072-a186-4b1c-8bba-1500ed0608a5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.546044 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_8150ac8e-6303-4af1-8a21-6fb434df508b/setup-container/0.log" Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.580579 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a024072-a186-4b1c-8bba-1500ed0608a5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.580611 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn7hl\" (UniqueName: \"kubernetes.io/projected/9a024072-a186-4b1c-8bba-1500ed0608a5-kube-api-access-cn7hl\") on node \"crc\" DevicePath \"\"" Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.581618 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_8150ac8e-6303-4af1-8a21-6fb434df508b/rabbitmq/0.log" Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.594551 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-5mbcw_14251f6a-60c2-4493-9899-d61cf7f1a907/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.870673 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-xkn8j_35e15f1f-4511-493c-8e96-8446fb0b7b14/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:20 crc kubenswrapper[4740]: I0105 15:16:20.966532 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-flmts_d20a172f-298d-47ac-a858-cd1600b65c4e/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:21 crc kubenswrapper[4740]: I0105 15:16:21.116650 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-4xzrm_616aed46-7a36-4531-afca-003df9bfb20d/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:21 crc kubenswrapper[4740]: I0105 15:16:21.216819 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzm4t" event={"ID":"9a024072-a186-4b1c-8bba-1500ed0608a5","Type":"ContainerDied","Data":"e25d9401b5a24e70bfc88d09423424bf1a72814ee286ee98d5db0770d1986952"} Jan 05 15:16:21 crc kubenswrapper[4740]: I0105 15:16:21.216885 4740 scope.go:117] "RemoveContainer" containerID="561e5fa0d29682dd9a5aa9cd0667205eddca22f8bcc9f3f655aac5c6f7e7059e" Jan 05 15:16:21 crc kubenswrapper[4740]: I0105 15:16:21.217103 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzm4t" Jan 05 15:16:21 crc kubenswrapper[4740]: I0105 15:16:21.267820 4740 scope.go:117] "RemoveContainer" containerID="6cbc3e586cbd63df9b65e1402082f2c5126da4b793e483d47ffce09d275acddb" Jan 05 15:16:21 crc kubenswrapper[4740]: I0105 15:16:21.302582 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzm4t"] Jan 05 15:16:21 crc kubenswrapper[4740]: I0105 15:16:21.326352 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzm4t"] Jan 05 15:16:21 crc kubenswrapper[4740]: I0105 15:16:21.358264 4740 scope.go:117] "RemoveContainer" containerID="e2a41f5d9f9482f7372b6ce7ba6442025d52b45f80956035630ebe53c95183db" Jan 05 15:16:21 crc kubenswrapper[4740]: I0105 15:16:21.582050 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-lwc8v_c0946f8e-100f-4ceb-9766-2254ec001229/ssh-known-hosts-edpm-deployment/0.log" Jan 05 15:16:21 crc kubenswrapper[4740]: I0105 15:16:21.775619 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7d5f88dcf-flfmp_981fd275-b6a0-4023-b7a0-17427043fab4/proxy-server/0.log" Jan 05 15:16:21 crc kubenswrapper[4740]: I0105 15:16:21.994161 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7d5f88dcf-flfmp_981fd275-b6a0-4023-b7a0-17427043fab4/proxy-httpd/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.014959 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-qvtqr_a24dd80e-d5b8-4449-96f3-d2682acd78c8/swift-ring-rebalance/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.023963 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_204a8fe1-f0b3-4b24-9496-ee5c800200d8/memcached/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.135753 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e1b97919-20e0-4eb9-a60b-0f52a4d7c73b/account-auditor/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.198644 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e1b97919-20e0-4eb9-a60b-0f52a4d7c73b/account-reaper/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.210433 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e1b97919-20e0-4eb9-a60b-0f52a4d7c73b/account-replicator/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.288058 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e1b97919-20e0-4eb9-a60b-0f52a4d7c73b/container-auditor/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.307710 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e1b97919-20e0-4eb9-a60b-0f52a4d7c73b/account-server/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.391559 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e1b97919-20e0-4eb9-a60b-0f52a4d7c73b/container-replicator/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.407243 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e1b97919-20e0-4eb9-a60b-0f52a4d7c73b/container-server/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.431448 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_e1b97919-20e0-4eb9-a60b-0f52a4d7c73b/container-updater/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.514613 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e1b97919-20e0-4eb9-a60b-0f52a4d7c73b/object-auditor/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.548542 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e1b97919-20e0-4eb9-a60b-0f52a4d7c73b/object-expirer/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.639396 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e1b97919-20e0-4eb9-a60b-0f52a4d7c73b/object-replicator/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.662088 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e1b97919-20e0-4eb9-a60b-0f52a4d7c73b/object-server/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.683797 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e1b97919-20e0-4eb9-a60b-0f52a4d7c73b/object-updater/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.737351 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e1b97919-20e0-4eb9-a60b-0f52a4d7c73b/rsync/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.777535 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e1b97919-20e0-4eb9-a60b-0f52a4d7c73b/swift-recon-cron/0.log" Jan 05 15:16:22 crc kubenswrapper[4740]: I0105 15:16:22.989007 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a024072-a186-4b1c-8bba-1500ed0608a5" path="/var/lib/kubelet/pods/9a024072-a186-4b1c-8bba-1500ed0608a5/volumes" Jan 05 15:16:23 crc kubenswrapper[4740]: I0105 15:16:23.884717 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-sxmt4_6f1d62a2-396b-4081-a429-95ccbc8c8764/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:23 crc kubenswrapper[4740]: I0105 15:16:23.924757 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-vfgrv_f2e8ef4b-ba8e-46af-b20d-f19af317419c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:24 crc kubenswrapper[4740]: I0105 15:16:24.096461 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_990a893a-383c-432f-822e-a3da59d882c1/test-operator-logs-container/0.log" Jan 05 15:16:24 crc kubenswrapper[4740]: I0105 15:16:24.200055 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-s6hxn_8f16ba41-c679-41ca-80e4-508f95ff78ca/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 05 15:16:24 crc kubenswrapper[4740]: I0105 15:16:24.411599 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_efc1648a-7270-45c6-af93-bd4b641931d2/tempest-tests-tempest-tests-runner/0.log" Jan 05 15:16:31 crc kubenswrapper[4740]: I0105 15:16:31.916437 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Jan 05 15:16:31 crc kubenswrapper[4740]: I0105 15:16:31.917040 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 15:16:52 crc kubenswrapper[4740]: I0105 15:16:52.997301 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw_5f093e0b-c437-4ab1-8e0d-ff7408f72e14/util/0.log" Jan 05 15:16:53 crc kubenswrapper[4740]: I0105 15:16:53.223633 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw_5f093e0b-c437-4ab1-8e0d-ff7408f72e14/pull/0.log" Jan 05 15:16:53 crc kubenswrapper[4740]: I0105 15:16:53.256720 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw_5f093e0b-c437-4ab1-8e0d-ff7408f72e14/pull/0.log" Jan 05 15:16:53 crc kubenswrapper[4740]: I0105 15:16:53.325073 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw_5f093e0b-c437-4ab1-8e0d-ff7408f72e14/util/0.log" Jan 05 15:16:53 crc kubenswrapper[4740]: I0105 15:16:53.467744 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw_5f093e0b-c437-4ab1-8e0d-ff7408f72e14/util/0.log" Jan 05 15:16:53 crc kubenswrapper[4740]: I0105 15:16:53.474898 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw_5f093e0b-c437-4ab1-8e0d-ff7408f72e14/pull/0.log" Jan 05 15:16:53 crc kubenswrapper[4740]: I0105 15:16:53.475383 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_10af4f86459b0243e6c9ab9ac5509a621c5dcd9d310a999ff7198c45bfgd7dw_5f093e0b-c437-4ab1-8e0d-ff7408f72e14/extract/0.log" Jan 05 15:16:53 crc kubenswrapper[4740]: I0105 15:16:53.729335 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-f6f74d6db-jqccs_3868391b-95fe-40be-a77d-593ea72fd786/manager/0.log" Jan 05 15:16:53 crc kubenswrapper[4740]: I0105 15:16:53.738391 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-78979fc445-5dgxw_a07332cc-11af-4d3a-8761-891417586bd1/manager/0.log" Jan 05 15:16:53 crc kubenswrapper[4740]: I0105 15:16:53.868114 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66f8b87655-hfcj9_01f58b56-275e-432c-aecc-f9853194f0fd/manager/0.log" Jan 05 15:16:54 crc kubenswrapper[4740]: I0105 15:16:54.044358 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7b549fc966-7jlfq_d8ffad98-ed22-4c4c-b0b8-234c3358089e/manager/0.log" Jan 05 15:16:54 crc kubenswrapper[4740]: I0105 15:16:54.179924 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-658dd65b86-lz62l_e2dc84c3-c204-4f17-bcf3-418ab17b873d/manager/0.log" Jan 05 15:16:54 crc kubenswrapper[4740]: I0105 
15:16:54.210877 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7f5ddd8d7b-skl95_61306334-5c80-4b48-8c47-bbc9a26f5ef3/manager/0.log" Jan 05 15:16:54 crc kubenswrapper[4740]: I0105 15:16:54.460691 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-f99f54bc8-9s2d8_cb54fddc-8710-4066-908b-bb7a00a15c7e/manager/1.log" Jan 05 15:16:54 crc kubenswrapper[4740]: I0105 15:16:54.504668 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-f99f54bc8-9s2d8_cb54fddc-8710-4066-908b-bb7a00a15c7e/manager/0.log" Jan 05 15:16:54 crc kubenswrapper[4740]: I0105 15:16:54.728832 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6d99759cf-mglsd_3f0a6bbe-32b3-4e4d-afef-32e871616c6d/manager/0.log" Jan 05 15:16:54 crc kubenswrapper[4740]: I0105 15:16:54.769534 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-568985c78-7sdkl_d4245835-8bf3-4491-9e66-a456d2fea83d/manager/0.log" Jan 05 15:16:54 crc kubenswrapper[4740]: I0105 15:16:54.917300 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-598945d5b8-bn857_ca259c15-4c6d-4142-b257-12e805385d3f/manager/0.log" Jan 05 15:16:55 crc kubenswrapper[4740]: I0105 15:16:55.022874 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b88bfc995-vqb8k_b73e8f21-70b9-4f4a-b96b-f255e80db992/manager/0.log" Jan 05 15:16:55 crc kubenswrapper[4740]: I0105 15:16:55.240816 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7cd87b778f-szhmd_35577830-7016-49ad-bae0-8a9962a2e82c/manager/0.log" Jan 05 15:16:55 crc kubenswrapper[4740]: I0105 15:16:55.371438 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5fbbf8b6cc-2hj8b_9d82f35e-307a-4ed2-89e0-0649e5300e41/manager/0.log" Jan 05 15:16:55 crc kubenswrapper[4740]: I0105 15:16:55.407763 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-68c649d9d-8tvrz_da2bbca8-6d81-4fd9-b2a6-e52f98b7fb29/manager/0.log" Jan 05 15:16:55 crc kubenswrapper[4740]: I0105 15:16:55.547911 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-78948ddfd7m88wc_6faf05ee-49e0-4d3e-afcd-d11d9494da44/manager/0.log" Jan 05 15:16:55 crc kubenswrapper[4740]: I0105 15:16:55.974512 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-xlq6v_3a862b07-6296-43d8-8aff-2a6fdf1bd898/registry-server/1.log" Jan 05 15:16:56 crc kubenswrapper[4740]: I0105 15:16:56.105497 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6d46c7d5f9-sm4zx_97dce6b2-fc01-4ced-a77a-a506dcb06eff/operator/0.log" Jan 05 15:16:56 crc kubenswrapper[4740]: I0105 15:16:56.117424 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-xlq6v_3a862b07-6296-43d8-8aff-2a6fdf1bd898/registry-server/0.log" Jan 05 15:16:56 crc kubenswrapper[4740]: I0105 15:16:56.385141 4740 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bf6d4f946-8qngq_71602a38-6096-4224-b49a-adfccfe02180/manager/0.log" Jan 05 15:16:56 crc kubenswrapper[4740]: I0105 15:16:56.482092 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-9b6f8f78c-6wdrm_ab509865-4e08-4927-b702-f28bfb553a27/manager/0.log" Jan 05 15:16:56 crc kubenswrapper[4740]: I0105 15:16:56.662744 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-6ppzp_ed1081fd-322e-4e98-aa00-e5aeef21b7b3/operator/0.log" Jan 05 15:16:56 crc kubenswrapper[4740]: I0105 15:16:56.975201 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bb586bbf4-vvlkq_d1228a5b-52ed-4d7e-940e-b4b03288fae5/manager/0.log" Jan 05 15:16:57 crc kubenswrapper[4740]: I0105 15:16:57.263900 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-779f597f97-v7z84_49bbef73-8653-4747-93ee-35819a394b1f/manager/0.log" Jan 05 15:16:57 crc kubenswrapper[4740]: I0105 15:16:57.277403 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6c866cfdcb-jbqnp_c5e3ed99-183e-41f6-bbee-d5c8e7f629d1/manager/0.log" Jan 05 15:16:57 crc kubenswrapper[4740]: I0105 15:16:57.516743 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7dc6b6df78-rkxtw_c7f964a1-5e19-4cb6-8e25-26fdc09410af/manager/0.log" Jan 05 15:16:57 crc kubenswrapper[4740]: I0105 15:16:57.916893 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-9dbdf6486-tfjlm_b8ae6035-6986-4e15-ac19-6e093c0a9e7a/manager/0.log" Jan 05 15:17:01 crc kubenswrapper[4740]: I0105 15:17:01.916055 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 15:17:01 crc kubenswrapper[4740]: I0105 15:17:01.916633 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 15:17:19 crc kubenswrapper[4740]: I0105 15:17:19.822750 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-kkdl5_d7d9be6f-7415-4c1a-a6b9-6f5222d5580f/control-plane-machine-set-operator/0.log" Jan 05 15:17:20 crc kubenswrapper[4740]: I0105 15:17:20.066510 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7v62c_97b4920c-fd39-4ef5-96b6-a044e3440f62/kube-rbac-proxy/0.log" Jan 05 15:17:20 crc kubenswrapper[4740]: I0105 15:17:20.083965 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7v62c_97b4920c-fd39-4ef5-96b6-a044e3440f62/machine-api-operator/0.log" Jan 05 15:17:31 crc kubenswrapper[4740]: I0105 15:17:31.916401 4740 patch_prober.go:28] interesting 
pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 15:17:31 crc kubenswrapper[4740]: I0105 15:17:31.916913 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 15:17:31 crc kubenswrapper[4740]: I0105 15:17:31.916951 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 15:17:31 crc kubenswrapper[4740]: I0105 15:17:31.918145 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 15:17:31 crc kubenswrapper[4740]: I0105 15:17:31.918274 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" gracePeriod=600 Jan 05 15:17:32 crc kubenswrapper[4740]: E0105 15:17:32.062580 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:17:32 crc kubenswrapper[4740]: I0105 15:17:32.154315 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" exitCode=0 Jan 05 15:17:32 crc kubenswrapper[4740]: I0105 15:17:32.154356 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b"} Jan 05 15:17:32 crc kubenswrapper[4740]: I0105 15:17:32.154390 4740 scope.go:117] "RemoveContainer" containerID="5b8a6b87265d54ac4ffe376ba3cafb17fa6677039615a1c5d95bd77a60f25bca" Jan 05 15:17:32 crc kubenswrapper[4740]: I0105 15:17:32.155252 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:17:32 crc kubenswrapper[4740]: E0105 15:17:32.155647 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:17:33 crc kubenswrapper[4740]: I0105 15:17:33.939185 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-swj9b_bd3cc742-4514-481b-8eef-dc428aae4320/cert-manager-controller/0.log" Jan 05 15:17:34 crc kubenswrapper[4740]: I0105 15:17:34.365946 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-gpw7d_de715cf7-9b2d-4980-96da-f3b7d489b142/cert-manager-cainjector/0.log" Jan 05 15:17:34 crc kubenswrapper[4740]: I0105 15:17:34.529257 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-zc946_237e57c0-4e0a-4c9f-89a2-e5a84fb41d22/cert-manager-webhook/1.log" Jan 05 15:17:34 crc kubenswrapper[4740]: I0105 15:17:34.569910 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-zc946_237e57c0-4e0a-4c9f-89a2-e5a84fb41d22/cert-manager-webhook/0.log" Jan 05 15:17:43 crc kubenswrapper[4740]: I0105 15:17:43.968949 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:17:43 crc kubenswrapper[4740]: E0105 15:17:43.970816 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:17:49 crc kubenswrapper[4740]: I0105 15:17:49.646290 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-b7fh2_f49ae0c4-aa16-448b-859d-45ed8809ac9d/nmstate-console-plugin/0.log" Jan 05 15:17:49 crc kubenswrapper[4740]: I0105 15:17:49.860203 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-qpc9m_7b892142-9e83-40e5-b305-28c1c3dde6d5/kube-rbac-proxy/0.log" Jan 05 15:17:49 crc kubenswrapper[4740]: I0105 15:17:49.864313 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-58hgk_a4bfce1d-9af3-49f5-877e-b5ea29088ac7/nmstate-handler/0.log" Jan 05 15:17:49 crc kubenswrapper[4740]: I0105 15:17:49.985630 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-qpc9m_7b892142-9e83-40e5-b305-28c1c3dde6d5/nmstate-metrics/0.log" Jan 05 15:17:50 crc kubenswrapper[4740]: I0105 15:17:50.077305 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-bg7t9_49e4bc2c-7786-4f56-8ec8-650c1a388d67/nmstate-operator/0.log" Jan 05 15:17:50 crc kubenswrapper[4740]: I0105 15:17:50.175997 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-zvv7k_e6df01b0-f4c2-49c2-982d-4b814fd5d493/nmstate-webhook/0.log" Jan 05 15:17:55 crc kubenswrapper[4740]: I0105 15:17:55.968569 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:17:55 crc kubenswrapper[4740]: E0105 15:17:55.969546 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:18:05 crc kubenswrapper[4740]: I0105 15:18:05.685346 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-56d45b676b-q44gh_50147f9c-3a52-4e0e-b0cc-1fd94e7def10/kube-rbac-proxy/0.log" Jan 05 15:18:05 crc kubenswrapper[4740]: I0105 15:18:05.769548 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-56d45b676b-q44gh_50147f9c-3a52-4e0e-b0cc-1fd94e7def10/manager/1.log" Jan 05 15:18:05 crc kubenswrapper[4740]: I0105 15:18:05.910028 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-56d45b676b-q44gh_50147f9c-3a52-4e0e-b0cc-1fd94e7def10/manager/0.log" Jan 05 15:18:06 crc kubenswrapper[4740]: I0105 15:18:06.968087 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:18:06 crc kubenswrapper[4740]: E0105 15:18:06.968340 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:18:19 crc kubenswrapper[4740]: I0105 15:18:19.970114 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:18:19 crc kubenswrapper[4740]: E0105 15:18:19.970868 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:18:22 crc kubenswrapper[4740]: I0105 15:18:22.484561 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-79cf69ddc8-mzrfj_018b2164-91a6-498e-bcfb-481d36067d93/cluster-logging-operator/0.log" Jan 05 15:18:22 crc kubenswrapper[4740]: I0105 15:18:22.683172 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-bx9tf_ed4602b2-acd5-4357-98b4-b0b016dc8a61/collector/0.log" Jan 05 15:18:22 crc kubenswrapper[4740]: I0105 15:18:22.756497 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_8aa8f13a-7e90-4195-a5d7-0a9ca16edd7c/loki-compactor/0.log" Jan 05 15:18:22 crc kubenswrapper[4740]: I0105 15:18:22.854283 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-5f678c8dd6-bdcsh_7927742d-54ad-4fbb-841a-71d40648d88e/loki-distributor/0.log" Jan 05 15:18:22 crc kubenswrapper[4740]: I0105 15:18:22.968236 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-gateway-66cd7bf4cd-8svgw_fd41c01f-dff3-4b6a-ae38-8b114b384a59/gateway/0.log" Jan 05 15:18:23 crc kubenswrapper[4740]: I0105 15:18:23.034164 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-66cd7bf4cd-8svgw_fd41c01f-dff3-4b6a-ae38-8b114b384a59/opa/0.log" Jan 05 15:18:23 crc kubenswrapper[4740]: I0105 15:18:23.946869 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-66cd7bf4cd-8vc2p_1e34aa71-7c05-4606-a71a-2c5b20667ba1/gateway/0.log" Jan 05 15:18:23 crc kubenswrapper[4740]: I0105 15:18:23.961979 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_572c5991-5ab4-4786-9641-cc8f3ff4bd21/loki-index-gateway/0.log" Jan 05 15:18:23 crc kubenswrapper[4740]: I0105 15:18:23.970514 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-66cd7bf4cd-8vc2p_1e34aa71-7c05-4606-a71a-2c5b20667ba1/opa/0.log" Jan 05 15:18:24 crc kubenswrapper[4740]: I0105 15:18:24.206313 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-76788598db-n78fw_5c35e5ba-6f22-4fbd-b3d9-e7ebe1980d41/loki-querier/0.log" Jan 05 15:18:24 crc kubenswrapper[4740]: I0105 15:18:24.206751 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_7a870fe5-0a84-4e0b-9798-7fe2adb8f6cb/loki-ingester/0.log" Jan 05 15:18:24 crc kubenswrapper[4740]: I0105 15:18:24.394289 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-69d9546745-lbmjc_d8fd9857-fca2-4041-9c72-3747c84b6987/loki-query-frontend/0.log" Jan 05 15:18:32 crc kubenswrapper[4740]: I0105 15:18:32.969555 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:18:32 crc kubenswrapper[4740]: E0105 15:18:32.970393 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:18:39 crc kubenswrapper[4740]: I0105 15:18:39.186445 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-pgtqj_22a24e17-6432-4f77-a553-47f9de4d68e4/kube-rbac-proxy/0.log" Jan 05 15:18:39 crc kubenswrapper[4740]: I0105 15:18:39.351716 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-pgtqj_22a24e17-6432-4f77-a553-47f9de4d68e4/controller/0.log" Jan 05 15:18:39 crc kubenswrapper[4740]: I0105 15:18:39.436478 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/cp-frr-files/0.log" Jan 05 15:18:39 crc kubenswrapper[4740]: I0105 15:18:39.645825 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/cp-reloader/0.log" Jan 05 15:18:39 crc kubenswrapper[4740]: I0105 15:18:39.663190 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/cp-metrics/0.log" Jan 05 
15:18:39 crc kubenswrapper[4740]: I0105 15:18:39.664883 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/cp-frr-files/0.log" Jan 05 15:18:39 crc kubenswrapper[4740]: I0105 15:18:39.686886 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/cp-reloader/0.log" Jan 05 15:18:39 crc kubenswrapper[4740]: I0105 15:18:39.884019 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/cp-frr-files/0.log" Jan 05 15:18:39 crc kubenswrapper[4740]: I0105 15:18:39.903248 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/cp-metrics/0.log" Jan 05 15:18:39 crc kubenswrapper[4740]: I0105 15:18:39.935825 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/cp-reloader/0.log" Jan 05 15:18:39 crc kubenswrapper[4740]: I0105 15:18:39.947857 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/cp-metrics/0.log" Jan 05 15:18:40 crc kubenswrapper[4740]: I0105 15:18:40.114740 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/cp-frr-files/0.log" Jan 05 15:18:40 crc kubenswrapper[4740]: I0105 15:18:40.146876 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/cp-metrics/0.log" Jan 05 15:18:40 crc kubenswrapper[4740]: I0105 15:18:40.155711 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/cp-reloader/0.log" Jan 05 15:18:40 crc kubenswrapper[4740]: I0105 15:18:40.188550 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/controller/0.log" Jan 05 15:18:40 crc kubenswrapper[4740]: I0105 15:18:40.355536 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/frr/1.log" Jan 05 15:18:40 crc kubenswrapper[4740]: I0105 15:18:40.368099 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/frr-metrics/0.log" Jan 05 15:18:40 crc kubenswrapper[4740]: I0105 15:18:40.464272 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/kube-rbac-proxy/0.log" Jan 05 15:18:40 crc kubenswrapper[4740]: I0105 15:18:40.643661 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/kube-rbac-proxy-frr/0.log" Jan 05 15:18:40 crc kubenswrapper[4740]: I0105 15:18:40.675862 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/reloader/0.log" Jan 05 15:18:40 crc kubenswrapper[4740]: I0105 15:18:40.909348 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-qhkx7_afd48a06-614a-4fed-8629-a1a2eb83ab80/frr-k8s-webhook-server/1.log" Jan 05 15:18:40 crc kubenswrapper[4740]: I0105 15:18:40.942410 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-qhkx7_afd48a06-614a-4fed-8629-a1a2eb83ab80/frr-k8s-webhook-server/0.log" Jan 05 15:18:41 crc kubenswrapper[4740]: I0105 15:18:41.213585 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-d9475cc54-vwc79_939f9254-d7a5-48cb-8ab1-7ea2e4f68610/webhook-server/1.log" Jan 05 15:18:41 crc kubenswrapper[4740]: I0105 15:18:41.225845 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-65b6fb4bb9-pfr54_8de8b15c-984a-4c5f-a956-5f4244da97ef/manager/0.log" Jan 05 15:18:41 crc kubenswrapper[4740]: I0105 15:18:41.411104 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-d9475cc54-vwc79_939f9254-d7a5-48cb-8ab1-7ea2e4f68610/webhook-server/0.log" Jan 05 15:18:41 crc kubenswrapper[4740]: I0105 15:18:41.467586 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vp8lm_ba42b606-5ea8-4d51-a695-dc563937f304/kube-rbac-proxy/0.log" Jan 05 15:18:41 crc kubenswrapper[4740]: I0105 15:18:41.983167 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ffh7k_3c26b9da-cc4e-44dc-92ba-92e42b962010/frr/0.log" Jan 05 15:18:42 crc kubenswrapper[4740]: I0105 15:18:42.181824 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vp8lm_ba42b606-5ea8-4d51-a695-dc563937f304/speaker/0.log" Jan 05 15:18:45 crc kubenswrapper[4740]: I0105 15:18:45.969109 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:18:45 crc kubenswrapper[4740]: E0105 15:18:45.969855 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:18:56 crc kubenswrapper[4740]: I0105 15:18:56.074080 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4_e82c700b-18bd-40e7-a546-67dcf6484bc9/util/0.log" Jan 05 15:18:56 crc kubenswrapper[4740]: I0105 15:18:56.228143 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4_e82c700b-18bd-40e7-a546-67dcf6484bc9/util/0.log" Jan 05 15:18:56 crc kubenswrapper[4740]: I0105 15:18:56.263514 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4_e82c700b-18bd-40e7-a546-67dcf6484bc9/pull/0.log" Jan 05 15:18:56 crc kubenswrapper[4740]: I0105 15:18:56.282173 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4_e82c700b-18bd-40e7-a546-67dcf6484bc9/pull/0.log" Jan 05 15:18:56 crc kubenswrapper[4740]: I0105 15:18:56.451594 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4_e82c700b-18bd-40e7-a546-67dcf6484bc9/util/0.log" Jan 05 15:18:56 crc kubenswrapper[4740]: I0105 
15:18:56.473547 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4_e82c700b-18bd-40e7-a546-67dcf6484bc9/extract/0.log" Jan 05 15:18:56 crc kubenswrapper[4740]: I0105 15:18:56.488383 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2vxhz4_e82c700b-18bd-40e7-a546-67dcf6484bc9/pull/0.log" Jan 05 15:18:56 crc kubenswrapper[4740]: I0105 15:18:56.635515 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh_912ddca7-b1d4-4430-8b31-36519503d33e/util/0.log" Jan 05 15:18:56 crc kubenswrapper[4740]: I0105 15:18:56.803866 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh_912ddca7-b1d4-4430-8b31-36519503d33e/pull/0.log" Jan 05 15:18:56 crc kubenswrapper[4740]: I0105 15:18:56.809088 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh_912ddca7-b1d4-4430-8b31-36519503d33e/util/0.log" Jan 05 15:18:56 crc kubenswrapper[4740]: I0105 15:18:56.888535 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh_912ddca7-b1d4-4430-8b31-36519503d33e/pull/0.log" Jan 05 15:18:56 crc kubenswrapper[4740]: I0105 15:18:56.968791 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:18:56 crc kubenswrapper[4740]: E0105 15:18:56.969296 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:18:57 crc kubenswrapper[4740]: I0105 15:18:57.013616 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh_912ddca7-b1d4-4430-8b31-36519503d33e/extract/0.log" Jan 05 15:18:57 crc kubenswrapper[4740]: I0105 15:18:57.014420 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh_912ddca7-b1d4-4430-8b31-36519503d33e/util/0.log" Jan 05 15:18:57 crc kubenswrapper[4740]: I0105 15:18:57.026215 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bg27jh_912ddca7-b1d4-4430-8b31-36519503d33e/pull/0.log" Jan 05 15:18:57 crc kubenswrapper[4740]: I0105 15:18:57.186148 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84_c471fc2a-21ef-4027-b9ce-5d335f5735f2/util/0.log" Jan 05 15:18:57 crc kubenswrapper[4740]: I0105 15:18:57.390631 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84_c471fc2a-21ef-4027-b9ce-5d335f5735f2/pull/0.log" Jan 05 15:18:57 crc kubenswrapper[4740]: 
I0105 15:18:57.390752 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84_c471fc2a-21ef-4027-b9ce-5d335f5735f2/pull/0.log" Jan 05 15:18:57 crc kubenswrapper[4740]: I0105 15:18:57.433312 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84_c471fc2a-21ef-4027-b9ce-5d335f5735f2/util/0.log" Jan 05 15:18:57 crc kubenswrapper[4740]: I0105 15:18:57.680907 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84_c471fc2a-21ef-4027-b9ce-5d335f5735f2/pull/0.log" Jan 05 15:18:57 crc kubenswrapper[4740]: I0105 15:18:57.683604 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84_c471fc2a-21ef-4027-b9ce-5d335f5735f2/util/0.log" Jan 05 15:18:57 crc kubenswrapper[4740]: I0105 15:18:57.689211 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d442j84_c471fc2a-21ef-4027-b9ce-5d335f5735f2/extract/0.log" Jan 05 15:18:57 crc kubenswrapper[4740]: I0105 15:18:57.864364 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94_0dc34484-b121-476e-8aa8-e969485032b5/util/0.log" Jan 05 15:18:58 crc kubenswrapper[4740]: I0105 15:18:58.079809 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94_0dc34484-b121-476e-8aa8-e969485032b5/util/0.log" Jan 05 15:18:58 crc kubenswrapper[4740]: I0105 15:18:58.081348 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94_0dc34484-b121-476e-8aa8-e969485032b5/pull/0.log" Jan 05 15:18:58 crc kubenswrapper[4740]: I0105 15:18:58.115415 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94_0dc34484-b121-476e-8aa8-e969485032b5/pull/0.log" Jan 05 15:18:58 crc kubenswrapper[4740]: I0105 15:18:58.307709 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94_0dc34484-b121-476e-8aa8-e969485032b5/pull/0.log" Jan 05 15:18:58 crc kubenswrapper[4740]: I0105 15:18:58.331205 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94_0dc34484-b121-476e-8aa8-e969485032b5/extract/0.log" Jan 05 15:18:58 crc kubenswrapper[4740]: I0105 15:18:58.336719 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8qng94_0dc34484-b121-476e-8aa8-e969485032b5/util/0.log" Jan 05 15:18:58 crc kubenswrapper[4740]: I0105 15:18:58.506792 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p_f127a860-4853-4359-9cef-ec66add405e3/util/0.log" Jan 05 15:18:58 crc kubenswrapper[4740]: I0105 15:18:58.672002 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p_f127a860-4853-4359-9cef-ec66add405e3/util/0.log" Jan 05 15:18:58 crc kubenswrapper[4740]: I0105 15:18:58.692941 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p_f127a860-4853-4359-9cef-ec66add405e3/pull/0.log" Jan 05 15:18:58 crc kubenswrapper[4740]: I0105 15:18:58.714883 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p_f127a860-4853-4359-9cef-ec66add405e3/pull/0.log" Jan 05 15:18:58 crc kubenswrapper[4740]: I0105 15:18:58.915282 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p_f127a860-4853-4359-9cef-ec66add405e3/pull/0.log" Jan 05 15:18:58 crc kubenswrapper[4740]: I0105 15:18:58.958794 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p_f127a860-4853-4359-9cef-ec66add405e3/util/0.log" Jan 05 15:18:59 crc kubenswrapper[4740]: I0105 15:18:59.009347 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085jv8p_f127a860-4853-4359-9cef-ec66add405e3/extract/0.log" Jan 05 15:18:59 crc kubenswrapper[4740]: I0105 15:18:59.097413 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dgjkq_02c01131-569c-43e4-b848-8d4b49a383d4/extract-utilities/0.log" Jan 05 15:18:59 crc kubenswrapper[4740]: I0105 15:18:59.288460 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dgjkq_02c01131-569c-43e4-b848-8d4b49a383d4/extract-utilities/0.log" Jan 05 15:18:59 crc kubenswrapper[4740]: I0105 15:18:59.288542 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dgjkq_02c01131-569c-43e4-b848-8d4b49a383d4/extract-content/0.log" Jan 05 15:18:59 crc kubenswrapper[4740]: I0105 15:18:59.296133 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dgjkq_02c01131-569c-43e4-b848-8d4b49a383d4/extract-content/0.log" Jan 05 15:18:59 crc kubenswrapper[4740]: I0105 15:18:59.523583 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dgjkq_02c01131-569c-43e4-b848-8d4b49a383d4/extract-utilities/0.log" Jan 05 15:18:59 crc kubenswrapper[4740]: I0105 15:18:59.532357 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dgjkq_02c01131-569c-43e4-b848-8d4b49a383d4/extract-content/0.log" Jan 05 15:18:59 crc kubenswrapper[4740]: I0105 15:18:59.569340 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dwcfs_b023e432-3b4e-4161-bfcc-b5d8b601e9d5/extract-utilities/0.log" Jan 05 15:18:59 crc kubenswrapper[4740]: I0105 15:18:59.739715 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dwcfs_b023e432-3b4e-4161-bfcc-b5d8b601e9d5/extract-content/0.log" Jan 05 15:18:59 crc kubenswrapper[4740]: I0105 15:18:59.857862 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-dwcfs_b023e432-3b4e-4161-bfcc-b5d8b601e9d5/extract-utilities/0.log" Jan 05 15:18:59 crc kubenswrapper[4740]: I0105 15:18:59.894284 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dwcfs_b023e432-3b4e-4161-bfcc-b5d8b601e9d5/extract-content/0.log" Jan 05 15:19:00 crc kubenswrapper[4740]: I0105 15:19:00.027269 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dgjkq_02c01131-569c-43e4-b848-8d4b49a383d4/registry-server/0.log" Jan 05 15:19:00 crc kubenswrapper[4740]: I0105 15:19:00.033896 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dwcfs_b023e432-3b4e-4161-bfcc-b5d8b601e9d5/extract-content/0.log" Jan 05 15:19:00 crc kubenswrapper[4740]: I0105 15:19:00.038687 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dwcfs_b023e432-3b4e-4161-bfcc-b5d8b601e9d5/extract-utilities/0.log" Jan 05 15:19:00 crc kubenswrapper[4740]: I0105 15:19:00.154175 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dwcfs_b023e432-3b4e-4161-bfcc-b5d8b601e9d5/registry-server/0.log" Jan 05 15:19:00 crc kubenswrapper[4740]: I0105 15:19:00.234187 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lfp42_f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879/extract-utilities/0.log" Jan 05 15:19:00 crc kubenswrapper[4740]: I0105 15:19:00.403453 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lfp42_f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879/extract-content/0.log" Jan 05 15:19:00 crc kubenswrapper[4740]: I0105 15:19:00.404963 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lfp42_f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879/extract-utilities/0.log" Jan 05 15:19:00 crc kubenswrapper[4740]: I0105 15:19:00.405173 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lfp42_f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879/extract-content/0.log" Jan 05 15:19:00 crc kubenswrapper[4740]: I0105 15:19:00.591562 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lfp42_f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879/extract-utilities/0.log" Jan 05 15:19:00 crc kubenswrapper[4740]: I0105 15:19:00.609154 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lfp42_f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879/extract-content/0.log" Jan 05 15:19:00 crc kubenswrapper[4740]: I0105 15:19:00.661749 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bgrks_6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7/extract-utilities/0.log" Jan 05 15:19:00 crc kubenswrapper[4740]: I0105 15:19:00.827867 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lfp42_f6e2c2ba-abe4-4d56-92c8-4a3f4f5d8879/registry-server/0.log" Jan 05 15:19:00 crc kubenswrapper[4740]: I0105 15:19:00.916740 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bgrks_6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7/extract-content/0.log" Jan 05 15:19:00 crc kubenswrapper[4740]: I0105 15:19:00.917699 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-bgrks_6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7/extract-content/0.log" Jan 05 15:19:00 crc kubenswrapper[4740]: I0105 15:19:00.919739 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bgrks_6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7/extract-utilities/0.log" Jan 05 15:19:01 crc kubenswrapper[4740]: I0105 15:19:01.108208 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bgrks_6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7/extract-content/0.log" Jan 05 15:19:01 crc kubenswrapper[4740]: I0105 15:19:01.117789 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bgrks_6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7/extract-utilities/0.log" Jan 05 15:19:01 crc kubenswrapper[4740]: I0105 15:19:01.161766 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g9bwh_ceb39900-d5f8-4d29-b3ec-01a60b2e4378/extract-utilities/0.log" Jan 05 15:19:01 crc kubenswrapper[4740]: I0105 15:19:01.416714 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g9bwh_ceb39900-d5f8-4d29-b3ec-01a60b2e4378/extract-utilities/0.log" Jan 05 15:19:01 crc kubenswrapper[4740]: I0105 15:19:01.473272 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g9bwh_ceb39900-d5f8-4d29-b3ec-01a60b2e4378/extract-content/0.log" Jan 05 15:19:01 crc kubenswrapper[4740]: I0105 15:19:01.548741 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g9bwh_ceb39900-d5f8-4d29-b3ec-01a60b2e4378/extract-content/0.log" Jan 05 15:19:01 crc kubenswrapper[4740]: I0105 15:19:01.706854 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g9bwh_ceb39900-d5f8-4d29-b3ec-01a60b2e4378/extract-utilities/0.log" Jan 05 15:19:01 crc kubenswrapper[4740]: I0105 15:19:01.733817 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g9bwh_ceb39900-d5f8-4d29-b3ec-01a60b2e4378/extract-content/0.log" Jan 05 15:19:01 crc kubenswrapper[4740]: I0105 15:19:01.920673 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bgrks_6bf07ffe-6600-4ad4-b94c-3b7ac4a613a7/registry-server/0.log" Jan 05 15:19:01 crc kubenswrapper[4740]: I0105 15:19:01.954906 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tj7tn_4df26838-83be-4000-b37e-841a0457717b/extract-utilities/0.log" Jan 05 15:19:02 crc kubenswrapper[4740]: I0105 15:19:02.057022 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-g9bwh_ceb39900-d5f8-4d29-b3ec-01a60b2e4378/registry-server/0.log" Jan 05 15:19:02 crc kubenswrapper[4740]: I0105 15:19:02.161014 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tj7tn_4df26838-83be-4000-b37e-841a0457717b/extract-utilities/0.log" Jan 05 15:19:02 crc kubenswrapper[4740]: I0105 15:19:02.188213 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tj7tn_4df26838-83be-4000-b37e-841a0457717b/extract-content/0.log" Jan 05 15:19:02 crc kubenswrapper[4740]: I0105 15:19:02.229939 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-tj7tn_4df26838-83be-4000-b37e-841a0457717b/extract-content/0.log" Jan 05 15:19:02 crc kubenswrapper[4740]: I0105 15:19:02.400739 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tj7tn_4df26838-83be-4000-b37e-841a0457717b/extract-utilities/0.log" Jan 05 15:19:02 crc kubenswrapper[4740]: I0105 15:19:02.428338 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tj7tn_4df26838-83be-4000-b37e-841a0457717b/extract-content/0.log" Jan 05 15:19:02 crc kubenswrapper[4740]: I0105 15:19:02.444328 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-mrdpz_22d5567a-2314-42aa-b197-dac963dcbfd1/marketplace-operator/0.log" Jan 05 15:19:02 crc kubenswrapper[4740]: I0105 15:19:02.542159 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tj7tn_4df26838-83be-4000-b37e-841a0457717b/registry-server/0.log" Jan 05 15:19:02 crc kubenswrapper[4740]: I0105 15:19:02.623714 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8q5ns_4e38ea3c-6147-4049-a885-c9a247a5697c/extract-utilities/0.log" Jan 05 15:19:02 crc kubenswrapper[4740]: I0105 15:19:02.878422 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8q5ns_4e38ea3c-6147-4049-a885-c9a247a5697c/extract-utilities/0.log" Jan 05 15:19:02 crc kubenswrapper[4740]: I0105 15:19:02.879393 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8q5ns_4e38ea3c-6147-4049-a885-c9a247a5697c/extract-content/0.log" Jan 05 15:19:02 crc kubenswrapper[4740]: I0105 15:19:02.911596 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8q5ns_4e38ea3c-6147-4049-a885-c9a247a5697c/extract-content/0.log" Jan 05 15:19:03 crc kubenswrapper[4740]: I0105 15:19:03.736104 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8q5ns_4e38ea3c-6147-4049-a885-c9a247a5697c/extract-utilities/0.log" Jan 05 15:19:03 crc kubenswrapper[4740]: I0105 15:19:03.804118 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8q5ns_4e38ea3c-6147-4049-a885-c9a247a5697c/extract-content/0.log" Jan 05 15:19:03 crc kubenswrapper[4740]: I0105 15:19:03.875392 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7vp5p_836c7750-5680-4a56-8947-2df3b121bb3f/extract-utilities/0.log" Jan 05 15:19:03 crc kubenswrapper[4740]: I0105 15:19:03.883962 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8q5ns_4e38ea3c-6147-4049-a885-c9a247a5697c/registry-server/0.log" Jan 05 15:19:04 crc kubenswrapper[4740]: I0105 15:19:04.034688 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7vp5p_836c7750-5680-4a56-8947-2df3b121bb3f/extract-content/0.log" Jan 05 15:19:04 crc kubenswrapper[4740]: I0105 15:19:04.034688 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7vp5p_836c7750-5680-4a56-8947-2df3b121bb3f/extract-content/0.log" Jan 05 15:19:04 crc kubenswrapper[4740]: I0105 15:19:04.041716 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-7vp5p_836c7750-5680-4a56-8947-2df3b121bb3f/extract-utilities/0.log" Jan 05 15:19:04 crc kubenswrapper[4740]: I0105 15:19:04.233260 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7vp5p_836c7750-5680-4a56-8947-2df3b121bb3f/extract-utilities/0.log" Jan 05 15:19:04 crc kubenswrapper[4740]: I0105 15:19:04.238781 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7vp5p_836c7750-5680-4a56-8947-2df3b121bb3f/extract-content/0.log" Jan 05 15:19:04 crc kubenswrapper[4740]: I0105 15:19:04.300015 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mkqxp_dc07eab5-3d3e-4da1-aff1-dc180039a90a/extract-utilities/0.log" Jan 05 15:19:04 crc kubenswrapper[4740]: I0105 15:19:04.511183 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mkqxp_dc07eab5-3d3e-4da1-aff1-dc180039a90a/extract-utilities/0.log" Jan 05 15:19:04 crc kubenswrapper[4740]: I0105 15:19:04.539210 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mkqxp_dc07eab5-3d3e-4da1-aff1-dc180039a90a/extract-content/0.log" Jan 05 15:19:04 crc kubenswrapper[4740]: I0105 15:19:04.686542 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mkqxp_dc07eab5-3d3e-4da1-aff1-dc180039a90a/extract-content/0.log" Jan 05 15:19:04 crc kubenswrapper[4740]: I0105 15:19:04.728426 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mkqxp_dc07eab5-3d3e-4da1-aff1-dc180039a90a/extract-content/0.log" Jan 05 15:19:04 crc kubenswrapper[4740]: I0105 15:19:04.778137 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mkqxp_dc07eab5-3d3e-4da1-aff1-dc180039a90a/extract-utilities/0.log" Jan 05 15:19:04 crc kubenswrapper[4740]: I0105 15:19:04.958924 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mkqxp_dc07eab5-3d3e-4da1-aff1-dc180039a90a/registry-server/0.log" Jan 05 15:19:04 crc kubenswrapper[4740]: I0105 15:19:04.992432 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7vp5p_836c7750-5680-4a56-8947-2df3b121bb3f/registry-server/0.log" Jan 05 15:19:10 crc kubenswrapper[4740]: I0105 15:19:10.983461 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:19:10 crc kubenswrapper[4740]: E0105 15:19:10.984384 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:19:19 crc kubenswrapper[4740]: I0105 15:19:19.180241 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-9jwt7_abe8ad9d-d5fe-46ef-8220-0d45b4f077b2/prometheus-operator/0.log" Jan 05 15:19:19 crc kubenswrapper[4740]: I0105 15:19:19.403526 4740 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67b4fcdc66-hpfcw_f63eb54a-2bd6-4366-a206-095360d8b368/prometheus-operator-admission-webhook/0.log" Jan 05 15:19:19 crc kubenswrapper[4740]: I0105 15:19:19.447564 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67b4fcdc66-mctk4_c9f195d7-1f66-4278-98bf-4ed7bdbb42a1/prometheus-operator-admission-webhook/0.log" Jan 05 15:19:19 crc kubenswrapper[4740]: I0105 15:19:19.640335 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-lfg5l_45f52c16-f526-4498-bc85-2aec3b292a60/operator/0.log" Jan 05 15:19:19 crc kubenswrapper[4740]: I0105 15:19:19.659103 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-lfg5l_45f52c16-f526-4498-bc85-2aec3b292a60/operator/1.log" Jan 05 15:19:19 crc kubenswrapper[4740]: I0105 15:19:19.756766 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-p96xs_7d16bb19-c116-4398-9475-d7dbcfee470a/observability-ui-dashboards/0.log" Jan 05 15:19:19 crc kubenswrapper[4740]: I0105 15:19:19.889779 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-lpjfp_b13bfd05-5a88-449b-9d26-f11acf9c6bbf/perses-operator/0.log" Jan 05 15:19:24 crc kubenswrapper[4740]: I0105 15:19:24.968804 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:19:24 crc kubenswrapper[4740]: E0105 15:19:24.970330 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:19:34 crc kubenswrapper[4740]: I0105 15:19:34.816194 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-56d45b676b-q44gh_50147f9c-3a52-4e0e-b0cc-1fd94e7def10/kube-rbac-proxy/0.log" Jan 05 15:19:34 crc kubenswrapper[4740]: I0105 15:19:34.824273 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-56d45b676b-q44gh_50147f9c-3a52-4e0e-b0cc-1fd94e7def10/manager/1.log" Jan 05 15:19:34 crc kubenswrapper[4740]: I0105 15:19:34.869060 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-56d45b676b-q44gh_50147f9c-3a52-4e0e-b0cc-1fd94e7def10/manager/0.log" Jan 05 15:19:37 crc kubenswrapper[4740]: I0105 15:19:37.969891 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:19:37 crc kubenswrapper[4740]: E0105 15:19:37.970917 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" 
podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:19:48 crc kubenswrapper[4740]: I0105 15:19:48.970155 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:19:48 crc kubenswrapper[4740]: E0105 15:19:48.970963 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:19:58 crc kubenswrapper[4740]: E0105 15:19:58.910088 4740 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.97:37850->38.102.83.97:45101: write tcp 38.102.83.97:37850->38.102.83.97:45101: write: broken pipe Jan 05 15:20:03 crc kubenswrapper[4740]: I0105 15:20:03.968963 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:20:03 crc kubenswrapper[4740]: E0105 15:20:03.969724 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:20:18 crc kubenswrapper[4740]: I0105 15:20:18.970565 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:20:18 crc kubenswrapper[4740]: E0105 15:20:18.971175 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:20:25 crc kubenswrapper[4740]: I0105 15:20:25.274366 4740 trace.go:236] Trace[1861960804]: "Calculate volume metrics of mysql-db for pod openstack/openstack-galera-0" (05-Jan-2026 15:20:24.256) (total time: 1013ms): Jan 05 15:20:25 crc kubenswrapper[4740]: Trace[1861960804]: [1.013510026s] [1.013510026s] END Jan 05 15:20:31 crc kubenswrapper[4740]: I0105 15:20:31.968575 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:20:31 crc kubenswrapper[4740]: E0105 15:20:31.969588 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:20:42 crc kubenswrapper[4740]: I0105 15:20:42.969294 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:20:42 crc kubenswrapper[4740]: E0105 15:20:42.970354 
4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:20:55 crc kubenswrapper[4740]: I0105 15:20:55.968900 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:20:55 crc kubenswrapper[4740]: E0105 15:20:55.969959 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:21:06 crc kubenswrapper[4740]: I0105 15:21:06.968639 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:21:06 crc kubenswrapper[4740]: E0105 15:21:06.969345 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:21:20 crc kubenswrapper[4740]: I0105 15:21:20.969364 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:21:20 crc kubenswrapper[4740]: E0105 15:21:20.973267 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:21:33 crc kubenswrapper[4740]: I0105 15:21:33.186098 4740 generic.go:334] "Generic (PLEG): container finished" podID="e85987fd-4f1a-4e66-bfab-53cd215c5a50" containerID="6d08baa59f9c5a391990f2a6051b52acc30f14308f2312591214f093296f3329" exitCode=0 Jan 05 15:21:33 crc kubenswrapper[4740]: I0105 15:21:33.186238 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fpnwq/must-gather-9qwfr" event={"ID":"e85987fd-4f1a-4e66-bfab-53cd215c5a50","Type":"ContainerDied","Data":"6d08baa59f9c5a391990f2a6051b52acc30f14308f2312591214f093296f3329"} Jan 05 15:21:33 crc kubenswrapper[4740]: I0105 15:21:33.187825 4740 scope.go:117] "RemoveContainer" containerID="6d08baa59f9c5a391990f2a6051b52acc30f14308f2312591214f093296f3329" Jan 05 15:21:34 crc kubenswrapper[4740]: I0105 15:21:34.275655 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fpnwq_must-gather-9qwfr_e85987fd-4f1a-4e66-bfab-53cd215c5a50/gather/0.log" Jan 05 15:21:34 crc kubenswrapper[4740]: I0105 15:21:34.969538 4740 scope.go:117] "RemoveContainer" 
containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:21:34 crc kubenswrapper[4740]: E0105 15:21:34.969890 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:21:41 crc kubenswrapper[4740]: I0105 15:21:41.940275 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fpnwq/must-gather-9qwfr"] Jan 05 15:21:41 crc kubenswrapper[4740]: I0105 15:21:41.942410 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-fpnwq/must-gather-9qwfr" podUID="e85987fd-4f1a-4e66-bfab-53cd215c5a50" containerName="copy" containerID="cri-o://6231234ec5e0e35f6f6b5890059692067c9c8bb6eebd9620815a08e62c89e841" gracePeriod=2 Jan 05 15:21:41 crc kubenswrapper[4740]: I0105 15:21:41.955042 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fpnwq/must-gather-9qwfr"] Jan 05 15:21:42 crc kubenswrapper[4740]: I0105 15:21:42.323240 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fpnwq_must-gather-9qwfr_e85987fd-4f1a-4e66-bfab-53cd215c5a50/copy/0.log" Jan 05 15:21:42 crc kubenswrapper[4740]: I0105 15:21:42.323855 4740 generic.go:334] "Generic (PLEG): container finished" podID="e85987fd-4f1a-4e66-bfab-53cd215c5a50" containerID="6231234ec5e0e35f6f6b5890059692067c9c8bb6eebd9620815a08e62c89e841" exitCode=143 Jan 05 15:21:42 crc kubenswrapper[4740]: I0105 15:21:42.617418 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fpnwq_must-gather-9qwfr_e85987fd-4f1a-4e66-bfab-53cd215c5a50/copy/0.log" Jan 05 15:21:42 crc kubenswrapper[4740]: I0105 15:21:42.618115 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fpnwq/must-gather-9qwfr" Jan 05 15:21:42 crc kubenswrapper[4740]: I0105 15:21:42.810439 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx997\" (UniqueName: \"kubernetes.io/projected/e85987fd-4f1a-4e66-bfab-53cd215c5a50-kube-api-access-kx997\") pod \"e85987fd-4f1a-4e66-bfab-53cd215c5a50\" (UID: \"e85987fd-4f1a-4e66-bfab-53cd215c5a50\") " Jan 05 15:21:42 crc kubenswrapper[4740]: I0105 15:21:42.811111 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e85987fd-4f1a-4e66-bfab-53cd215c5a50-must-gather-output\") pod \"e85987fd-4f1a-4e66-bfab-53cd215c5a50\" (UID: \"e85987fd-4f1a-4e66-bfab-53cd215c5a50\") " Jan 05 15:21:42 crc kubenswrapper[4740]: I0105 15:21:42.822101 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e85987fd-4f1a-4e66-bfab-53cd215c5a50-kube-api-access-kx997" (OuterVolumeSpecName: "kube-api-access-kx997") pod "e85987fd-4f1a-4e66-bfab-53cd215c5a50" (UID: "e85987fd-4f1a-4e66-bfab-53cd215c5a50"). InnerVolumeSpecName "kube-api-access-kx997". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 15:21:42 crc kubenswrapper[4740]: I0105 15:21:42.915695 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx997\" (UniqueName: \"kubernetes.io/projected/e85987fd-4f1a-4e66-bfab-53cd215c5a50-kube-api-access-kx997\") on node \"crc\" DevicePath \"\"" Jan 05 15:21:43 crc kubenswrapper[4740]: I0105 15:21:43.011202 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e85987fd-4f1a-4e66-bfab-53cd215c5a50-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e85987fd-4f1a-4e66-bfab-53cd215c5a50" (UID: "e85987fd-4f1a-4e66-bfab-53cd215c5a50"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:21:43 crc kubenswrapper[4740]: I0105 15:21:43.017766 4740 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e85987fd-4f1a-4e66-bfab-53cd215c5a50-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 05 15:21:43 crc kubenswrapper[4740]: I0105 15:21:43.344411 4740 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fpnwq_must-gather-9qwfr_e85987fd-4f1a-4e66-bfab-53cd215c5a50/copy/0.log" Jan 05 15:21:43 crc kubenswrapper[4740]: I0105 15:21:43.344776 4740 scope.go:117] "RemoveContainer" containerID="6231234ec5e0e35f6f6b5890059692067c9c8bb6eebd9620815a08e62c89e841" Jan 05 15:21:43 crc kubenswrapper[4740]: I0105 15:21:43.344895 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fpnwq/must-gather-9qwfr" Jan 05 15:21:43 crc kubenswrapper[4740]: I0105 15:21:43.374923 4740 scope.go:117] "RemoveContainer" containerID="6d08baa59f9c5a391990f2a6051b52acc30f14308f2312591214f093296f3329" Jan 05 15:21:44 crc kubenswrapper[4740]: I0105 15:21:44.981762 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e85987fd-4f1a-4e66-bfab-53cd215c5a50" path="/var/lib/kubelet/pods/e85987fd-4f1a-4e66-bfab-53cd215c5a50/volumes" Jan 05 15:21:47 crc kubenswrapper[4740]: I0105 15:21:47.968642 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:21:47 crc kubenswrapper[4740]: E0105 15:21:47.969390 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:22:02 crc kubenswrapper[4740]: I0105 15:22:02.969144 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:22:02 crc kubenswrapper[4740]: E0105 15:22:02.969771 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:22:17 crc kubenswrapper[4740]: I0105 15:22:17.968800 4740 scope.go:117] "RemoveContainer" 
containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:22:17 crc kubenswrapper[4740]: E0105 15:22:17.969687 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:22:28 crc kubenswrapper[4740]: I0105 15:22:28.969635 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:22:28 crc kubenswrapper[4740]: E0105 15:22:28.971846 4740 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xf724_openshift-machine-config-operator(7737db78-0989-433f-968a-7e5b441b7537)\"" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" Jan 05 15:22:41 crc kubenswrapper[4740]: I0105 15:22:41.969225 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:22:43 crc kubenswrapper[4740]: I0105 15:22:43.213303 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"10a41b1b661894249080ff040394f00a60665b6e5a3fd43b32aaeda3aea23239"} Jan 05 15:22:44 crc kubenswrapper[4740]: I0105 15:22:44.854726 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-96zj9"] Jan 05 15:22:44 crc kubenswrapper[4740]: E0105 15:22:44.856022 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85987fd-4f1a-4e66-bfab-53cd215c5a50" containerName="gather" Jan 05 15:22:44 crc kubenswrapper[4740]: I0105 15:22:44.856040 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85987fd-4f1a-4e66-bfab-53cd215c5a50" containerName="gather" Jan 05 15:22:44 crc kubenswrapper[4740]: E0105 15:22:44.856106 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a024072-a186-4b1c-8bba-1500ed0608a5" containerName="extract-content" Jan 05 15:22:44 crc kubenswrapper[4740]: I0105 15:22:44.856115 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a024072-a186-4b1c-8bba-1500ed0608a5" containerName="extract-content" Jan 05 15:22:44 crc kubenswrapper[4740]: E0105 15:22:44.856133 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85987fd-4f1a-4e66-bfab-53cd215c5a50" containerName="copy" Jan 05 15:22:44 crc kubenswrapper[4740]: I0105 15:22:44.856139 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85987fd-4f1a-4e66-bfab-53cd215c5a50" containerName="copy" Jan 05 15:22:44 crc kubenswrapper[4740]: E0105 15:22:44.856155 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a024072-a186-4b1c-8bba-1500ed0608a5" containerName="extract-utilities" Jan 05 15:22:44 crc kubenswrapper[4740]: I0105 15:22:44.856161 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a024072-a186-4b1c-8bba-1500ed0608a5" containerName="extract-utilities" Jan 05 15:22:44 crc kubenswrapper[4740]: E0105 15:22:44.856173 4740 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a024072-a186-4b1c-8bba-1500ed0608a5" containerName="registry-server" Jan 05 15:22:44 crc kubenswrapper[4740]: I0105 15:22:44.856178 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a024072-a186-4b1c-8bba-1500ed0608a5" containerName="registry-server" Jan 05 15:22:44 crc kubenswrapper[4740]: I0105 15:22:44.856616 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85987fd-4f1a-4e66-bfab-53cd215c5a50" containerName="copy" Jan 05 15:22:44 crc kubenswrapper[4740]: I0105 15:22:44.856646 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85987fd-4f1a-4e66-bfab-53cd215c5a50" containerName="gather" Jan 05 15:22:44 crc kubenswrapper[4740]: I0105 15:22:44.856659 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a024072-a186-4b1c-8bba-1500ed0608a5" containerName="registry-server" Jan 05 15:22:44 crc kubenswrapper[4740]: I0105 15:22:44.862254 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-96zj9" Jan 05 15:22:44 crc kubenswrapper[4740]: I0105 15:22:44.895955 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-96zj9"] Jan 05 15:22:45 crc kubenswrapper[4740]: I0105 15:22:45.002379 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68npb\" (UniqueName: \"kubernetes.io/projected/af1438ed-8c8c-4842-a32b-0e19371debe8-kube-api-access-68npb\") pod \"redhat-operators-96zj9\" (UID: \"af1438ed-8c8c-4842-a32b-0e19371debe8\") " pod="openshift-marketplace/redhat-operators-96zj9" Jan 05 15:22:45 crc kubenswrapper[4740]: I0105 15:22:45.002661 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af1438ed-8c8c-4842-a32b-0e19371debe8-utilities\") pod \"redhat-operators-96zj9\" (UID: \"af1438ed-8c8c-4842-a32b-0e19371debe8\") " pod="openshift-marketplace/redhat-operators-96zj9" Jan 05 15:22:45 crc kubenswrapper[4740]: I0105 15:22:45.003151 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af1438ed-8c8c-4842-a32b-0e19371debe8-catalog-content\") pod \"redhat-operators-96zj9\" (UID: \"af1438ed-8c8c-4842-a32b-0e19371debe8\") " pod="openshift-marketplace/redhat-operators-96zj9" Jan 05 15:22:45 crc kubenswrapper[4740]: I0105 15:22:45.105753 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af1438ed-8c8c-4842-a32b-0e19371debe8-catalog-content\") pod \"redhat-operators-96zj9\" (UID: \"af1438ed-8c8c-4842-a32b-0e19371debe8\") " pod="openshift-marketplace/redhat-operators-96zj9" Jan 05 15:22:45 crc kubenswrapper[4740]: I0105 15:22:45.105865 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68npb\" (UniqueName: \"kubernetes.io/projected/af1438ed-8c8c-4842-a32b-0e19371debe8-kube-api-access-68npb\") pod \"redhat-operators-96zj9\" (UID: \"af1438ed-8c8c-4842-a32b-0e19371debe8\") " pod="openshift-marketplace/redhat-operators-96zj9" Jan 05 15:22:45 crc kubenswrapper[4740]: I0105 15:22:45.105960 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af1438ed-8c8c-4842-a32b-0e19371debe8-utilities\") pod 
\"redhat-operators-96zj9\" (UID: \"af1438ed-8c8c-4842-a32b-0e19371debe8\") " pod="openshift-marketplace/redhat-operators-96zj9" Jan 05 15:22:45 crc kubenswrapper[4740]: I0105 15:22:45.106475 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af1438ed-8c8c-4842-a32b-0e19371debe8-utilities\") pod \"redhat-operators-96zj9\" (UID: \"af1438ed-8c8c-4842-a32b-0e19371debe8\") " pod="openshift-marketplace/redhat-operators-96zj9" Jan 05 15:22:45 crc kubenswrapper[4740]: I0105 15:22:45.106743 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af1438ed-8c8c-4842-a32b-0e19371debe8-catalog-content\") pod \"redhat-operators-96zj9\" (UID: \"af1438ed-8c8c-4842-a32b-0e19371debe8\") " pod="openshift-marketplace/redhat-operators-96zj9" Jan 05 15:22:45 crc kubenswrapper[4740]: I0105 15:22:45.129416 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68npb\" (UniqueName: \"kubernetes.io/projected/af1438ed-8c8c-4842-a32b-0e19371debe8-kube-api-access-68npb\") pod \"redhat-operators-96zj9\" (UID: \"af1438ed-8c8c-4842-a32b-0e19371debe8\") " pod="openshift-marketplace/redhat-operators-96zj9" Jan 05 15:22:45 crc kubenswrapper[4740]: I0105 15:22:45.191558 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-96zj9" Jan 05 15:22:45 crc kubenswrapper[4740]: I0105 15:22:45.725384 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-96zj9"] Jan 05 15:22:46 crc kubenswrapper[4740]: I0105 15:22:46.247456 4740 generic.go:334] "Generic (PLEG): container finished" podID="af1438ed-8c8c-4842-a32b-0e19371debe8" containerID="abb880cfcadab3591c54fb08aa0bf9f52b72dce364b3bba8974e92dcb51b93e6" exitCode=0 Jan 05 15:22:46 crc kubenswrapper[4740]: I0105 15:22:46.247564 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96zj9" event={"ID":"af1438ed-8c8c-4842-a32b-0e19371debe8","Type":"ContainerDied","Data":"abb880cfcadab3591c54fb08aa0bf9f52b72dce364b3bba8974e92dcb51b93e6"} Jan 05 15:22:46 crc kubenswrapper[4740]: I0105 15:22:46.247729 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96zj9" event={"ID":"af1438ed-8c8c-4842-a32b-0e19371debe8","Type":"ContainerStarted","Data":"582d0b6eb6afe332f9493af76bee62d05b1d5ca336966602cd271be16b5dbc87"} Jan 05 15:22:46 crc kubenswrapper[4740]: I0105 15:22:46.250433 4740 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 05 15:22:48 crc kubenswrapper[4740]: I0105 15:22:48.272414 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96zj9" event={"ID":"af1438ed-8c8c-4842-a32b-0e19371debe8","Type":"ContainerStarted","Data":"8ffb40989e83fc3a5e951a42c9aa005a6c1301e3ef65bfc31590f3acdcac91e4"} Jan 05 15:22:51 crc kubenswrapper[4740]: I0105 15:22:51.312413 4740 generic.go:334] "Generic (PLEG): container finished" podID="af1438ed-8c8c-4842-a32b-0e19371debe8" containerID="8ffb40989e83fc3a5e951a42c9aa005a6c1301e3ef65bfc31590f3acdcac91e4" exitCode=0 Jan 05 15:22:51 crc kubenswrapper[4740]: I0105 15:22:51.312516 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96zj9" 
event={"ID":"af1438ed-8c8c-4842-a32b-0e19371debe8","Type":"ContainerDied","Data":"8ffb40989e83fc3a5e951a42c9aa005a6c1301e3ef65bfc31590f3acdcac91e4"} Jan 05 15:22:52 crc kubenswrapper[4740]: I0105 15:22:52.326775 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96zj9" event={"ID":"af1438ed-8c8c-4842-a32b-0e19371debe8","Type":"ContainerStarted","Data":"bba7927e8ca1344881cde8f0021ab3566d465e9c3409c885a00136c3306f833b"} Jan 05 15:22:52 crc kubenswrapper[4740]: I0105 15:22:52.358998 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-96zj9" podStartSLOduration=2.824554911 podStartE2EDuration="8.35886691s" podCreationTimestamp="2026-01-05 15:22:44 +0000 UTC" firstStartedPulling="2026-01-05 15:22:46.249257237 +0000 UTC m=+5615.556165816" lastFinishedPulling="2026-01-05 15:22:51.783569236 +0000 UTC m=+5621.090477815" observedRunningTime="2026-01-05 15:22:52.342788855 +0000 UTC m=+5621.649697454" watchObservedRunningTime="2026-01-05 15:22:52.35886691 +0000 UTC m=+5621.665775489" Jan 05 15:22:55 crc kubenswrapper[4740]: I0105 15:22:55.191944 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-96zj9" Jan 05 15:22:55 crc kubenswrapper[4740]: I0105 15:22:55.192705 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-96zj9" Jan 05 15:22:56 crc kubenswrapper[4740]: I0105 15:22:56.246617 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-96zj9" podUID="af1438ed-8c8c-4842-a32b-0e19371debe8" containerName="registry-server" probeResult="failure" output=< Jan 05 15:22:56 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:22:56 crc kubenswrapper[4740]: > Jan 05 15:23:06 crc kubenswrapper[4740]: I0105 15:23:06.244352 4740 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-96zj9" podUID="af1438ed-8c8c-4842-a32b-0e19371debe8" containerName="registry-server" probeResult="failure" output=< Jan 05 15:23:06 crc kubenswrapper[4740]: timeout: failed to connect service ":50051" within 1s Jan 05 15:23:06 crc kubenswrapper[4740]: > Jan 05 15:23:15 crc kubenswrapper[4740]: I0105 15:23:15.788949 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-96zj9" Jan 05 15:23:15 crc kubenswrapper[4740]: I0105 15:23:15.851838 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-96zj9" Jan 05 15:23:16 crc kubenswrapper[4740]: I0105 15:23:16.039782 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-96zj9"] Jan 05 15:23:17 crc kubenswrapper[4740]: I0105 15:23:17.614554 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-96zj9" podUID="af1438ed-8c8c-4842-a32b-0e19371debe8" containerName="registry-server" containerID="cri-o://bba7927e8ca1344881cde8f0021ab3566d465e9c3409c885a00136c3306f833b" gracePeriod=2 Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.239224 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-96zj9" Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.384443 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68npb\" (UniqueName: \"kubernetes.io/projected/af1438ed-8c8c-4842-a32b-0e19371debe8-kube-api-access-68npb\") pod \"af1438ed-8c8c-4842-a32b-0e19371debe8\" (UID: \"af1438ed-8c8c-4842-a32b-0e19371debe8\") " Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.384516 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af1438ed-8c8c-4842-a32b-0e19371debe8-utilities\") pod \"af1438ed-8c8c-4842-a32b-0e19371debe8\" (UID: \"af1438ed-8c8c-4842-a32b-0e19371debe8\") " Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.384906 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af1438ed-8c8c-4842-a32b-0e19371debe8-catalog-content\") pod \"af1438ed-8c8c-4842-a32b-0e19371debe8\" (UID: \"af1438ed-8c8c-4842-a32b-0e19371debe8\") " Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.385394 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af1438ed-8c8c-4842-a32b-0e19371debe8-utilities" (OuterVolumeSpecName: "utilities") pod "af1438ed-8c8c-4842-a32b-0e19371debe8" (UID: "af1438ed-8c8c-4842-a32b-0e19371debe8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.385822 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af1438ed-8c8c-4842-a32b-0e19371debe8-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.394258 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1438ed-8c8c-4842-a32b-0e19371debe8-kube-api-access-68npb" (OuterVolumeSpecName: "kube-api-access-68npb") pod "af1438ed-8c8c-4842-a32b-0e19371debe8" (UID: "af1438ed-8c8c-4842-a32b-0e19371debe8"). InnerVolumeSpecName "kube-api-access-68npb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.485316 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af1438ed-8c8c-4842-a32b-0e19371debe8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af1438ed-8c8c-4842-a32b-0e19371debe8" (UID: "af1438ed-8c8c-4842-a32b-0e19371debe8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.488639 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af1438ed-8c8c-4842-a32b-0e19371debe8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.488670 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68npb\" (UniqueName: \"kubernetes.io/projected/af1438ed-8c8c-4842-a32b-0e19371debe8-kube-api-access-68npb\") on node \"crc\" DevicePath \"\"" Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.641624 4740 generic.go:334] "Generic (PLEG): container finished" podID="af1438ed-8c8c-4842-a32b-0e19371debe8" containerID="bba7927e8ca1344881cde8f0021ab3566d465e9c3409c885a00136c3306f833b" exitCode=0 Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.641687 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-96zj9" Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.641689 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96zj9" event={"ID":"af1438ed-8c8c-4842-a32b-0e19371debe8","Type":"ContainerDied","Data":"bba7927e8ca1344881cde8f0021ab3566d465e9c3409c885a00136c3306f833b"} Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.641837 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96zj9" event={"ID":"af1438ed-8c8c-4842-a32b-0e19371debe8","Type":"ContainerDied","Data":"582d0b6eb6afe332f9493af76bee62d05b1d5ca336966602cd271be16b5dbc87"} Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.641878 4740 scope.go:117] "RemoveContainer" containerID="bba7927e8ca1344881cde8f0021ab3566d465e9c3409c885a00136c3306f833b" Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.681836 4740 scope.go:117] "RemoveContainer" containerID="8ffb40989e83fc3a5e951a42c9aa005a6c1301e3ef65bfc31590f3acdcac91e4" Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.697746 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-96zj9"] Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.708894 4740 scope.go:117] "RemoveContainer" containerID="abb880cfcadab3591c54fb08aa0bf9f52b72dce364b3bba8974e92dcb51b93e6" Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.715467 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-96zj9"] Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.761639 4740 scope.go:117] "RemoveContainer" containerID="bba7927e8ca1344881cde8f0021ab3566d465e9c3409c885a00136c3306f833b" Jan 05 15:23:18 crc kubenswrapper[4740]: E0105 15:23:18.762588 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba7927e8ca1344881cde8f0021ab3566d465e9c3409c885a00136c3306f833b\": container with ID starting with bba7927e8ca1344881cde8f0021ab3566d465e9c3409c885a00136c3306f833b not found: ID does not exist" containerID="bba7927e8ca1344881cde8f0021ab3566d465e9c3409c885a00136c3306f833b" Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.762643 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba7927e8ca1344881cde8f0021ab3566d465e9c3409c885a00136c3306f833b"} err="failed to get container status \"bba7927e8ca1344881cde8f0021ab3566d465e9c3409c885a00136c3306f833b\": 
rpc error: code = NotFound desc = could not find container \"bba7927e8ca1344881cde8f0021ab3566d465e9c3409c885a00136c3306f833b\": container with ID starting with bba7927e8ca1344881cde8f0021ab3566d465e9c3409c885a00136c3306f833b not found: ID does not exist" Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.762674 4740 scope.go:117] "RemoveContainer" containerID="8ffb40989e83fc3a5e951a42c9aa005a6c1301e3ef65bfc31590f3acdcac91e4" Jan 05 15:23:18 crc kubenswrapper[4740]: E0105 15:23:18.763200 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ffb40989e83fc3a5e951a42c9aa005a6c1301e3ef65bfc31590f3acdcac91e4\": container with ID starting with 8ffb40989e83fc3a5e951a42c9aa005a6c1301e3ef65bfc31590f3acdcac91e4 not found: ID does not exist" containerID="8ffb40989e83fc3a5e951a42c9aa005a6c1301e3ef65bfc31590f3acdcac91e4" Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.763232 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ffb40989e83fc3a5e951a42c9aa005a6c1301e3ef65bfc31590f3acdcac91e4"} err="failed to get container status \"8ffb40989e83fc3a5e951a42c9aa005a6c1301e3ef65bfc31590f3acdcac91e4\": rpc error: code = NotFound desc = could not find container \"8ffb40989e83fc3a5e951a42c9aa005a6c1301e3ef65bfc31590f3acdcac91e4\": container with ID starting with 8ffb40989e83fc3a5e951a42c9aa005a6c1301e3ef65bfc31590f3acdcac91e4 not found: ID does not exist" Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.763250 4740 scope.go:117] "RemoveContainer" containerID="abb880cfcadab3591c54fb08aa0bf9f52b72dce364b3bba8974e92dcb51b93e6" Jan 05 15:23:18 crc kubenswrapper[4740]: E0105 15:23:18.763551 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb880cfcadab3591c54fb08aa0bf9f52b72dce364b3bba8974e92dcb51b93e6\": container with ID starting with abb880cfcadab3591c54fb08aa0bf9f52b72dce364b3bba8974e92dcb51b93e6 not found: ID does not exist" containerID="abb880cfcadab3591c54fb08aa0bf9f52b72dce364b3bba8974e92dcb51b93e6" Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.763603 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb880cfcadab3591c54fb08aa0bf9f52b72dce364b3bba8974e92dcb51b93e6"} err="failed to get container status \"abb880cfcadab3591c54fb08aa0bf9f52b72dce364b3bba8974e92dcb51b93e6\": rpc error: code = NotFound desc = could not find container \"abb880cfcadab3591c54fb08aa0bf9f52b72dce364b3bba8974e92dcb51b93e6\": container with ID starting with abb880cfcadab3591c54fb08aa0bf9f52b72dce364b3bba8974e92dcb51b93e6 not found: ID does not exist" Jan 05 15:23:18 crc kubenswrapper[4740]: I0105 15:23:18.982468 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1438ed-8c8c-4842-a32b-0e19371debe8" path="/var/lib/kubelet/pods/af1438ed-8c8c-4842-a32b-0e19371debe8/volumes" Jan 05 15:23:21 crc kubenswrapper[4740]: I0105 15:23:21.655627 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2slkf"] Jan 05 15:23:21 crc kubenswrapper[4740]: E0105 15:23:21.656257 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1438ed-8c8c-4842-a32b-0e19371debe8" containerName="extract-utilities" Jan 05 15:23:21 crc kubenswrapper[4740]: I0105 15:23:21.656274 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1438ed-8c8c-4842-a32b-0e19371debe8" containerName="extract-utilities" Jan 
05 15:23:21 crc kubenswrapper[4740]: E0105 15:23:21.656296 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1438ed-8c8c-4842-a32b-0e19371debe8" containerName="registry-server" Jan 05 15:23:21 crc kubenswrapper[4740]: I0105 15:23:21.656305 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1438ed-8c8c-4842-a32b-0e19371debe8" containerName="registry-server" Jan 05 15:23:21 crc kubenswrapper[4740]: E0105 15:23:21.656326 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1438ed-8c8c-4842-a32b-0e19371debe8" containerName="extract-content" Jan 05 15:23:21 crc kubenswrapper[4740]: I0105 15:23:21.656333 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1438ed-8c8c-4842-a32b-0e19371debe8" containerName="extract-content" Jan 05 15:23:21 crc kubenswrapper[4740]: I0105 15:23:21.656626 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1438ed-8c8c-4842-a32b-0e19371debe8" containerName="registry-server" Jan 05 15:23:21 crc kubenswrapper[4740]: I0105 15:23:21.659144 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2slkf" Jan 05 15:23:21 crc kubenswrapper[4740]: I0105 15:23:21.668332 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2slkf"] Jan 05 15:23:21 crc kubenswrapper[4740]: I0105 15:23:21.785931 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m588\" (UniqueName: \"kubernetes.io/projected/7b5b3abe-385c-4b43-a030-cc689f4014d7-kube-api-access-6m588\") pod \"community-operators-2slkf\" (UID: \"7b5b3abe-385c-4b43-a030-cc689f4014d7\") " pod="openshift-marketplace/community-operators-2slkf" Jan 05 15:23:21 crc kubenswrapper[4740]: I0105 15:23:21.786407 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b5b3abe-385c-4b43-a030-cc689f4014d7-utilities\") pod \"community-operators-2slkf\" (UID: \"7b5b3abe-385c-4b43-a030-cc689f4014d7\") " pod="openshift-marketplace/community-operators-2slkf" Jan 05 15:23:21 crc kubenswrapper[4740]: I0105 15:23:21.786617 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b5b3abe-385c-4b43-a030-cc689f4014d7-catalog-content\") pod \"community-operators-2slkf\" (UID: \"7b5b3abe-385c-4b43-a030-cc689f4014d7\") " pod="openshift-marketplace/community-operators-2slkf" Jan 05 15:23:21 crc kubenswrapper[4740]: I0105 15:23:21.889271 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m588\" (UniqueName: \"kubernetes.io/projected/7b5b3abe-385c-4b43-a030-cc689f4014d7-kube-api-access-6m588\") pod \"community-operators-2slkf\" (UID: \"7b5b3abe-385c-4b43-a030-cc689f4014d7\") " pod="openshift-marketplace/community-operators-2slkf" Jan 05 15:23:21 crc kubenswrapper[4740]: I0105 15:23:21.889374 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b5b3abe-385c-4b43-a030-cc689f4014d7-utilities\") pod \"community-operators-2slkf\" (UID: \"7b5b3abe-385c-4b43-a030-cc689f4014d7\") " pod="openshift-marketplace/community-operators-2slkf" Jan 05 15:23:21 crc kubenswrapper[4740]: I0105 15:23:21.889526 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b5b3abe-385c-4b43-a030-cc689f4014d7-catalog-content\") pod \"community-operators-2slkf\" (UID: \"7b5b3abe-385c-4b43-a030-cc689f4014d7\") " pod="openshift-marketplace/community-operators-2slkf" Jan 05 15:23:21 crc kubenswrapper[4740]: I0105 15:23:21.889945 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b5b3abe-385c-4b43-a030-cc689f4014d7-utilities\") pod \"community-operators-2slkf\" (UID: \"7b5b3abe-385c-4b43-a030-cc689f4014d7\") " pod="openshift-marketplace/community-operators-2slkf" Jan 05 15:23:21 crc kubenswrapper[4740]: I0105 15:23:21.890051 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b5b3abe-385c-4b43-a030-cc689f4014d7-catalog-content\") pod \"community-operators-2slkf\" (UID: \"7b5b3abe-385c-4b43-a030-cc689f4014d7\") " pod="openshift-marketplace/community-operators-2slkf" Jan 05 15:23:21 crc kubenswrapper[4740]: I0105 15:23:21.914626 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m588\" (UniqueName: \"kubernetes.io/projected/7b5b3abe-385c-4b43-a030-cc689f4014d7-kube-api-access-6m588\") pod \"community-operators-2slkf\" (UID: \"7b5b3abe-385c-4b43-a030-cc689f4014d7\") " pod="openshift-marketplace/community-operators-2slkf" Jan 05 15:23:21 crc kubenswrapper[4740]: I0105 15:23:21.990134 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2slkf" Jan 05 15:23:22 crc kubenswrapper[4740]: I0105 15:23:22.504621 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2slkf"] Jan 05 15:23:22 crc kubenswrapper[4740]: W0105 15:23:22.516391 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b5b3abe_385c_4b43_a030_cc689f4014d7.slice/crio-c44c456a9e5a92719ded540f239806606bf11b8fdc4cbe28e3df9b670a449c89 WatchSource:0}: Error finding container c44c456a9e5a92719ded540f239806606bf11b8fdc4cbe28e3df9b670a449c89: Status 404 returned error can't find the container with id c44c456a9e5a92719ded540f239806606bf11b8fdc4cbe28e3df9b670a449c89 Jan 05 15:23:22 crc kubenswrapper[4740]: I0105 15:23:22.682638 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2slkf" event={"ID":"7b5b3abe-385c-4b43-a030-cc689f4014d7","Type":"ContainerStarted","Data":"c44c456a9e5a92719ded540f239806606bf11b8fdc4cbe28e3df9b670a449c89"} Jan 05 15:23:23 crc kubenswrapper[4740]: I0105 15:23:23.700424 4740 generic.go:334] "Generic (PLEG): container finished" podID="7b5b3abe-385c-4b43-a030-cc689f4014d7" containerID="7289e2c6461c42792eb5cf613e67124b8a7d11048432376a12efc7e43dd14a58" exitCode=0 Jan 05 15:23:23 crc kubenswrapper[4740]: I0105 15:23:23.700796 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2slkf" event={"ID":"7b5b3abe-385c-4b43-a030-cc689f4014d7","Type":"ContainerDied","Data":"7289e2c6461c42792eb5cf613e67124b8a7d11048432376a12efc7e43dd14a58"} Jan 05 15:23:24 crc kubenswrapper[4740]: I0105 15:23:24.713707 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2slkf" event={"ID":"7b5b3abe-385c-4b43-a030-cc689f4014d7","Type":"ContainerStarted","Data":"12b525f9098b6ffadc1fcadc01e351614f1cafe9c0a23592be96a6f4d2ba3fa5"} Jan 
05 15:23:25 crc kubenswrapper[4740]: I0105 15:23:25.725985 4740 generic.go:334] "Generic (PLEG): container finished" podID="7b5b3abe-385c-4b43-a030-cc689f4014d7" containerID="12b525f9098b6ffadc1fcadc01e351614f1cafe9c0a23592be96a6f4d2ba3fa5" exitCode=0 Jan 05 15:23:25 crc kubenswrapper[4740]: I0105 15:23:25.726087 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2slkf" event={"ID":"7b5b3abe-385c-4b43-a030-cc689f4014d7","Type":"ContainerDied","Data":"12b525f9098b6ffadc1fcadc01e351614f1cafe9c0a23592be96a6f4d2ba3fa5"} Jan 05 15:23:26 crc kubenswrapper[4740]: I0105 15:23:26.741463 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2slkf" event={"ID":"7b5b3abe-385c-4b43-a030-cc689f4014d7","Type":"ContainerStarted","Data":"5d4a86aec2f488465fb99f5a7c6317e313bc0e186417ae2367e4c4a94a9969d3"} Jan 05 15:23:31 crc kubenswrapper[4740]: I0105 15:23:31.990228 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2slkf" Jan 05 15:23:31 crc kubenswrapper[4740]: I0105 15:23:31.990693 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2slkf" Jan 05 15:23:32 crc kubenswrapper[4740]: I0105 15:23:32.050241 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2slkf" Jan 05 15:23:32 crc kubenswrapper[4740]: I0105 15:23:32.066035 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2slkf" podStartSLOduration=8.556081309 podStartE2EDuration="11.066019389s" podCreationTimestamp="2026-01-05 15:23:21 +0000 UTC" firstStartedPulling="2026-01-05 15:23:23.707599895 +0000 UTC m=+5653.014508474" lastFinishedPulling="2026-01-05 15:23:26.217537975 +0000 UTC m=+5655.524446554" observedRunningTime="2026-01-05 15:23:26.762263433 +0000 UTC m=+5656.069172052" watchObservedRunningTime="2026-01-05 15:23:32.066019389 +0000 UTC m=+5661.372927958" Jan 05 15:23:32 crc kubenswrapper[4740]: I0105 15:23:32.874196 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2slkf" Jan 05 15:23:33 crc kubenswrapper[4740]: I0105 15:23:33.440784 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2slkf"] Jan 05 15:23:34 crc kubenswrapper[4740]: I0105 15:23:34.839528 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2slkf" podUID="7b5b3abe-385c-4b43-a030-cc689f4014d7" containerName="registry-server" containerID="cri-o://5d4a86aec2f488465fb99f5a7c6317e313bc0e186417ae2367e4c4a94a9969d3" gracePeriod=2 Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.511169 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2slkf" Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.667323 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m588\" (UniqueName: \"kubernetes.io/projected/7b5b3abe-385c-4b43-a030-cc689f4014d7-kube-api-access-6m588\") pod \"7b5b3abe-385c-4b43-a030-cc689f4014d7\" (UID: \"7b5b3abe-385c-4b43-a030-cc689f4014d7\") " Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.667461 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b5b3abe-385c-4b43-a030-cc689f4014d7-catalog-content\") pod \"7b5b3abe-385c-4b43-a030-cc689f4014d7\" (UID: \"7b5b3abe-385c-4b43-a030-cc689f4014d7\") " Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.667738 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b5b3abe-385c-4b43-a030-cc689f4014d7-utilities\") pod \"7b5b3abe-385c-4b43-a030-cc689f4014d7\" (UID: \"7b5b3abe-385c-4b43-a030-cc689f4014d7\") " Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.668222 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b5b3abe-385c-4b43-a030-cc689f4014d7-utilities" (OuterVolumeSpecName: "utilities") pod "7b5b3abe-385c-4b43-a030-cc689f4014d7" (UID: "7b5b3abe-385c-4b43-a030-cc689f4014d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.669251 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b5b3abe-385c-4b43-a030-cc689f4014d7-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.673905 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b5b3abe-385c-4b43-a030-cc689f4014d7-kube-api-access-6m588" (OuterVolumeSpecName: "kube-api-access-6m588") pod "7b5b3abe-385c-4b43-a030-cc689f4014d7" (UID: "7b5b3abe-385c-4b43-a030-cc689f4014d7"). InnerVolumeSpecName "kube-api-access-6m588". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.740149 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b5b3abe-385c-4b43-a030-cc689f4014d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b5b3abe-385c-4b43-a030-cc689f4014d7" (UID: "7b5b3abe-385c-4b43-a030-cc689f4014d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.770888 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b5b3abe-385c-4b43-a030-cc689f4014d7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.770924 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m588\" (UniqueName: \"kubernetes.io/projected/7b5b3abe-385c-4b43-a030-cc689f4014d7-kube-api-access-6m588\") on node \"crc\" DevicePath \"\"" Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.849935 4740 generic.go:334] "Generic (PLEG): container finished" podID="7b5b3abe-385c-4b43-a030-cc689f4014d7" containerID="5d4a86aec2f488465fb99f5a7c6317e313bc0e186417ae2367e4c4a94a9969d3" exitCode=0 Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.849980 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2slkf" event={"ID":"7b5b3abe-385c-4b43-a030-cc689f4014d7","Type":"ContainerDied","Data":"5d4a86aec2f488465fb99f5a7c6317e313bc0e186417ae2367e4c4a94a9969d3"} Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.850012 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2slkf" event={"ID":"7b5b3abe-385c-4b43-a030-cc689f4014d7","Type":"ContainerDied","Data":"c44c456a9e5a92719ded540f239806606bf11b8fdc4cbe28e3df9b670a449c89"} Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.850015 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2slkf" Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.850037 4740 scope.go:117] "RemoveContainer" containerID="5d4a86aec2f488465fb99f5a7c6317e313bc0e186417ae2367e4c4a94a9969d3" Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.885465 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2slkf"] Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.888345 4740 scope.go:117] "RemoveContainer" containerID="12b525f9098b6ffadc1fcadc01e351614f1cafe9c0a23592be96a6f4d2ba3fa5" Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.898955 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2slkf"] Jan 05 15:23:35 crc kubenswrapper[4740]: I0105 15:23:35.928300 4740 scope.go:117] "RemoveContainer" containerID="7289e2c6461c42792eb5cf613e67124b8a7d11048432376a12efc7e43dd14a58" Jan 05 15:23:36 crc kubenswrapper[4740]: I0105 15:23:36.853351 4740 scope.go:117] "RemoveContainer" containerID="5d4a86aec2f488465fb99f5a7c6317e313bc0e186417ae2367e4c4a94a9969d3" Jan 05 15:23:36 crc kubenswrapper[4740]: E0105 15:23:36.854751 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d4a86aec2f488465fb99f5a7c6317e313bc0e186417ae2367e4c4a94a9969d3\": container with ID starting with 5d4a86aec2f488465fb99f5a7c6317e313bc0e186417ae2367e4c4a94a9969d3 not found: ID does not exist" containerID="5d4a86aec2f488465fb99f5a7c6317e313bc0e186417ae2367e4c4a94a9969d3" Jan 05 15:23:36 crc kubenswrapper[4740]: I0105 15:23:36.855179 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d4a86aec2f488465fb99f5a7c6317e313bc0e186417ae2367e4c4a94a9969d3"} err="failed to get container status 
\"5d4a86aec2f488465fb99f5a7c6317e313bc0e186417ae2367e4c4a94a9969d3\": rpc error: code = NotFound desc = could not find container \"5d4a86aec2f488465fb99f5a7c6317e313bc0e186417ae2367e4c4a94a9969d3\": container with ID starting with 5d4a86aec2f488465fb99f5a7c6317e313bc0e186417ae2367e4c4a94a9969d3 not found: ID does not exist" Jan 05 15:23:36 crc kubenswrapper[4740]: I0105 15:23:36.855226 4740 scope.go:117] "RemoveContainer" containerID="12b525f9098b6ffadc1fcadc01e351614f1cafe9c0a23592be96a6f4d2ba3fa5" Jan 05 15:23:36 crc kubenswrapper[4740]: E0105 15:23:36.855585 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12b525f9098b6ffadc1fcadc01e351614f1cafe9c0a23592be96a6f4d2ba3fa5\": container with ID starting with 12b525f9098b6ffadc1fcadc01e351614f1cafe9c0a23592be96a6f4d2ba3fa5 not found: ID does not exist" containerID="12b525f9098b6ffadc1fcadc01e351614f1cafe9c0a23592be96a6f4d2ba3fa5" Jan 05 15:23:36 crc kubenswrapper[4740]: I0105 15:23:36.855619 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12b525f9098b6ffadc1fcadc01e351614f1cafe9c0a23592be96a6f4d2ba3fa5"} err="failed to get container status \"12b525f9098b6ffadc1fcadc01e351614f1cafe9c0a23592be96a6f4d2ba3fa5\": rpc error: code = NotFound desc = could not find container \"12b525f9098b6ffadc1fcadc01e351614f1cafe9c0a23592be96a6f4d2ba3fa5\": container with ID starting with 12b525f9098b6ffadc1fcadc01e351614f1cafe9c0a23592be96a6f4d2ba3fa5 not found: ID does not exist" Jan 05 15:23:36 crc kubenswrapper[4740]: I0105 15:23:36.855638 4740 scope.go:117] "RemoveContainer" containerID="7289e2c6461c42792eb5cf613e67124b8a7d11048432376a12efc7e43dd14a58" Jan 05 15:23:36 crc kubenswrapper[4740]: E0105 15:23:36.856029 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7289e2c6461c42792eb5cf613e67124b8a7d11048432376a12efc7e43dd14a58\": container with ID starting with 7289e2c6461c42792eb5cf613e67124b8a7d11048432376a12efc7e43dd14a58 not found: ID does not exist" containerID="7289e2c6461c42792eb5cf613e67124b8a7d11048432376a12efc7e43dd14a58" Jan 05 15:23:36 crc kubenswrapper[4740]: I0105 15:23:36.856090 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7289e2c6461c42792eb5cf613e67124b8a7d11048432376a12efc7e43dd14a58"} err="failed to get container status \"7289e2c6461c42792eb5cf613e67124b8a7d11048432376a12efc7e43dd14a58\": rpc error: code = NotFound desc = could not find container \"7289e2c6461c42792eb5cf613e67124b8a7d11048432376a12efc7e43dd14a58\": container with ID starting with 7289e2c6461c42792eb5cf613e67124b8a7d11048432376a12efc7e43dd14a58 not found: ID does not exist" Jan 05 15:23:36 crc kubenswrapper[4740]: I0105 15:23:36.982406 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b5b3abe-385c-4b43-a030-cc689f4014d7" path="/var/lib/kubelet/pods/7b5b3abe-385c-4b43-a030-cc689f4014d7/volumes" Jan 05 15:25:01 crc kubenswrapper[4740]: I0105 15:25:01.916094 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 15:25:01 crc kubenswrapper[4740]: I0105 15:25:01.916845 4740 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 15:25:31 crc kubenswrapper[4740]: I0105 15:25:31.916398 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 15:25:31 crc kubenswrapper[4740]: I0105 15:25:31.916984 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 15:26:01 crc kubenswrapper[4740]: I0105 15:26:01.916583 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 15:26:01 crc kubenswrapper[4740]: I0105 15:26:01.917189 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 05 15:26:01 crc kubenswrapper[4740]: I0105 15:26:01.917237 4740 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xf724" Jan 05 15:26:01 crc kubenswrapper[4740]: I0105 15:26:01.918151 4740 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"10a41b1b661894249080ff040394f00a60665b6e5a3fd43b32aaeda3aea23239"} pod="openshift-machine-config-operator/machine-config-daemon-xf724" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 05 15:26:01 crc kubenswrapper[4740]: I0105 15:26:01.918206 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" containerID="cri-o://10a41b1b661894249080ff040394f00a60665b6e5a3fd43b32aaeda3aea23239" gracePeriod=600 Jan 05 15:26:02 crc kubenswrapper[4740]: I0105 15:26:02.709430 4740 generic.go:334] "Generic (PLEG): container finished" podID="7737db78-0989-433f-968a-7e5b441b7537" containerID="10a41b1b661894249080ff040394f00a60665b6e5a3fd43b32aaeda3aea23239" exitCode=0 Jan 05 15:26:02 crc kubenswrapper[4740]: I0105 15:26:02.709483 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerDied","Data":"10a41b1b661894249080ff040394f00a60665b6e5a3fd43b32aaeda3aea23239"} Jan 05 15:26:02 crc kubenswrapper[4740]: I0105 15:26:02.709944 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xf724" event={"ID":"7737db78-0989-433f-968a-7e5b441b7537","Type":"ContainerStarted","Data":"151dc1e1d9ee6fe81cf5d5bb6b83cc38c2f6667d962ba39f62ff84bea5fe5e09"} Jan 05 15:26:02 crc kubenswrapper[4740]: I0105 15:26:02.709968 4740 scope.go:117] "RemoveContainer" containerID="cab8d3ffd701ec7ce2119fe9cde4ac20fc9497ad653476736211b5bbe8ab1e4b" Jan 05 15:27:07 crc kubenswrapper[4740]: I0105 15:27:07.230706 4740 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wtmw2"] Jan 05 15:27:07 crc kubenswrapper[4740]: E0105 15:27:07.231810 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5b3abe-385c-4b43-a030-cc689f4014d7" containerName="extract-content" Jan 05 15:27:07 crc kubenswrapper[4740]: I0105 15:27:07.231829 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5b3abe-385c-4b43-a030-cc689f4014d7" containerName="extract-content" Jan 05 15:27:07 crc kubenswrapper[4740]: E0105 15:27:07.231869 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5b3abe-385c-4b43-a030-cc689f4014d7" containerName="extract-utilities" Jan 05 15:27:07 crc kubenswrapper[4740]: I0105 15:27:07.231878 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5b3abe-385c-4b43-a030-cc689f4014d7" containerName="extract-utilities" Jan 05 15:27:07 crc kubenswrapper[4740]: E0105 15:27:07.231902 4740 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5b3abe-385c-4b43-a030-cc689f4014d7" containerName="registry-server" Jan 05 15:27:07 crc kubenswrapper[4740]: I0105 15:27:07.231910 4740 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5b3abe-385c-4b43-a030-cc689f4014d7" containerName="registry-server" Jan 05 15:27:07 crc kubenswrapper[4740]: I0105 15:27:07.232297 4740 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5b3abe-385c-4b43-a030-cc689f4014d7" containerName="registry-server" Jan 05 15:27:07 crc kubenswrapper[4740]: I0105 15:27:07.234488 4740 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wtmw2" Jan 05 15:27:07 crc kubenswrapper[4740]: I0105 15:27:07.261682 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wtmw2"] Jan 05 15:27:07 crc kubenswrapper[4740]: I0105 15:27:07.335532 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvrsd\" (UniqueName: \"kubernetes.io/projected/af51479a-2b1c-40fd-a6c1-611d101dbf93-kube-api-access-tvrsd\") pod \"certified-operators-wtmw2\" (UID: \"af51479a-2b1c-40fd-a6c1-611d101dbf93\") " pod="openshift-marketplace/certified-operators-wtmw2" Jan 05 15:27:07 crc kubenswrapper[4740]: I0105 15:27:07.335718 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af51479a-2b1c-40fd-a6c1-611d101dbf93-utilities\") pod \"certified-operators-wtmw2\" (UID: \"af51479a-2b1c-40fd-a6c1-611d101dbf93\") " pod="openshift-marketplace/certified-operators-wtmw2" Jan 05 15:27:07 crc kubenswrapper[4740]: I0105 15:27:07.335762 4740 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af51479a-2b1c-40fd-a6c1-611d101dbf93-catalog-content\") pod \"certified-operators-wtmw2\" (UID: \"af51479a-2b1c-40fd-a6c1-611d101dbf93\") " pod="openshift-marketplace/certified-operators-wtmw2" Jan 05 15:27:07 crc kubenswrapper[4740]: I0105 15:27:07.437784 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af51479a-2b1c-40fd-a6c1-611d101dbf93-catalog-content\") pod \"certified-operators-wtmw2\" (UID: \"af51479a-2b1c-40fd-a6c1-611d101dbf93\") " pod="openshift-marketplace/certified-operators-wtmw2" Jan 05 15:27:07 crc kubenswrapper[4740]: I0105 15:27:07.438029 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvrsd\" (UniqueName: \"kubernetes.io/projected/af51479a-2b1c-40fd-a6c1-611d101dbf93-kube-api-access-tvrsd\") pod \"certified-operators-wtmw2\" (UID: \"af51479a-2b1c-40fd-a6c1-611d101dbf93\") " pod="openshift-marketplace/certified-operators-wtmw2" Jan 05 15:27:07 crc kubenswrapper[4740]: I0105 15:27:07.438260 4740 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af51479a-2b1c-40fd-a6c1-611d101dbf93-utilities\") pod \"certified-operators-wtmw2\" (UID: \"af51479a-2b1c-40fd-a6c1-611d101dbf93\") " pod="openshift-marketplace/certified-operators-wtmw2" Jan 05 15:27:07 crc kubenswrapper[4740]: I0105 15:27:07.438343 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af51479a-2b1c-40fd-a6c1-611d101dbf93-catalog-content\") pod \"certified-operators-wtmw2\" (UID: \"af51479a-2b1c-40fd-a6c1-611d101dbf93\") " pod="openshift-marketplace/certified-operators-wtmw2" Jan 05 15:27:07 crc kubenswrapper[4740]: I0105 15:27:07.438593 4740 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af51479a-2b1c-40fd-a6c1-611d101dbf93-utilities\") pod \"certified-operators-wtmw2\" (UID: \"af51479a-2b1c-40fd-a6c1-611d101dbf93\") " pod="openshift-marketplace/certified-operators-wtmw2" Jan 05 15:27:07 crc kubenswrapper[4740]: I0105 15:27:07.838051 4740 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tvrsd\" (UniqueName: \"kubernetes.io/projected/af51479a-2b1c-40fd-a6c1-611d101dbf93-kube-api-access-tvrsd\") pod \"certified-operators-wtmw2\" (UID: \"af51479a-2b1c-40fd-a6c1-611d101dbf93\") " pod="openshift-marketplace/certified-operators-wtmw2" Jan 05 15:27:07 crc kubenswrapper[4740]: I0105 15:27:07.861111 4740 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wtmw2" Jan 05 15:27:08 crc kubenswrapper[4740]: I0105 15:27:08.359869 4740 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wtmw2"] Jan 05 15:27:08 crc kubenswrapper[4740]: W0105 15:27:08.371900 4740 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf51479a_2b1c_40fd_a6c1_611d101dbf93.slice/crio-03693efad99b14223f9f45271b27958e39ace6d9d4c41c243a9f16ba6033123d WatchSource:0}: Error finding container 03693efad99b14223f9f45271b27958e39ace6d9d4c41c243a9f16ba6033123d: Status 404 returned error can't find the container with id 03693efad99b14223f9f45271b27958e39ace6d9d4c41c243a9f16ba6033123d Jan 05 15:27:08 crc kubenswrapper[4740]: I0105 15:27:08.540188 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtmw2" event={"ID":"af51479a-2b1c-40fd-a6c1-611d101dbf93","Type":"ContainerStarted","Data":"03693efad99b14223f9f45271b27958e39ace6d9d4c41c243a9f16ba6033123d"} Jan 05 15:27:09 crc kubenswrapper[4740]: I0105 15:27:09.551362 4740 generic.go:334] "Generic (PLEG): container finished" podID="af51479a-2b1c-40fd-a6c1-611d101dbf93" containerID="1807d2982b7bdf890afca2b42d82f6f1b95ea4bdeba39dcb4b4ba80928bfb1ae" exitCode=0 Jan 05 15:27:09 crc kubenswrapper[4740]: I0105 15:27:09.551462 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtmw2" event={"ID":"af51479a-2b1c-40fd-a6c1-611d101dbf93","Type":"ContainerDied","Data":"1807d2982b7bdf890afca2b42d82f6f1b95ea4bdeba39dcb4b4ba80928bfb1ae"} Jan 05 15:27:11 crc kubenswrapper[4740]: I0105 15:27:11.584816 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtmw2" event={"ID":"af51479a-2b1c-40fd-a6c1-611d101dbf93","Type":"ContainerStarted","Data":"cf79a3bde3eeceee479ac9bbd701f04b56be3f99da128c36e7fe567b0bacd73d"} Jan 05 15:27:12 crc kubenswrapper[4740]: I0105 15:27:12.599509 4740 generic.go:334] "Generic (PLEG): container finished" podID="af51479a-2b1c-40fd-a6c1-611d101dbf93" containerID="cf79a3bde3eeceee479ac9bbd701f04b56be3f99da128c36e7fe567b0bacd73d" exitCode=0 Jan 05 15:27:12 crc kubenswrapper[4740]: I0105 15:27:12.599605 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtmw2" event={"ID":"af51479a-2b1c-40fd-a6c1-611d101dbf93","Type":"ContainerDied","Data":"cf79a3bde3eeceee479ac9bbd701f04b56be3f99da128c36e7fe567b0bacd73d"} Jan 05 15:27:13 crc kubenswrapper[4740]: I0105 15:27:13.611961 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtmw2" event={"ID":"af51479a-2b1c-40fd-a6c1-611d101dbf93","Type":"ContainerStarted","Data":"93c0523a12321523dfb19f5a480b86f11a10a1f028e2c96bbd84187dc764cd5d"} Jan 05 15:27:13 crc kubenswrapper[4740]: I0105 15:27:13.638905 4740 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wtmw2" 
podStartSLOduration=3.08937532 podStartE2EDuration="6.638887585s" podCreationTimestamp="2026-01-05 15:27:07 +0000 UTC" firstStartedPulling="2026-01-05 15:27:09.555364157 +0000 UTC m=+5878.862272746" lastFinishedPulling="2026-01-05 15:27:13.104876432 +0000 UTC m=+5882.411785011" observedRunningTime="2026-01-05 15:27:13.63649875 +0000 UTC m=+5882.943407339" watchObservedRunningTime="2026-01-05 15:27:13.638887585 +0000 UTC m=+5882.945796174" Jan 05 15:27:17 crc kubenswrapper[4740]: I0105 15:27:17.861714 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wtmw2" Jan 05 15:27:17 crc kubenswrapper[4740]: I0105 15:27:17.862260 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wtmw2" Jan 05 15:27:17 crc kubenswrapper[4740]: I0105 15:27:17.934714 4740 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wtmw2" Jan 05 15:27:18 crc kubenswrapper[4740]: I0105 15:27:18.747749 4740 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wtmw2" Jan 05 15:27:18 crc kubenswrapper[4740]: I0105 15:27:18.814694 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wtmw2"] Jan 05 15:27:20 crc kubenswrapper[4740]: I0105 15:27:20.711440 4740 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wtmw2" podUID="af51479a-2b1c-40fd-a6c1-611d101dbf93" containerName="registry-server" containerID="cri-o://93c0523a12321523dfb19f5a480b86f11a10a1f028e2c96bbd84187dc764cd5d" gracePeriod=2 Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.331042 4740 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wtmw2" Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.493750 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af51479a-2b1c-40fd-a6c1-611d101dbf93-utilities\") pod \"af51479a-2b1c-40fd-a6c1-611d101dbf93\" (UID: \"af51479a-2b1c-40fd-a6c1-611d101dbf93\") " Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.494131 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvrsd\" (UniqueName: \"kubernetes.io/projected/af51479a-2b1c-40fd-a6c1-611d101dbf93-kube-api-access-tvrsd\") pod \"af51479a-2b1c-40fd-a6c1-611d101dbf93\" (UID: \"af51479a-2b1c-40fd-a6c1-611d101dbf93\") " Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.494252 4740 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af51479a-2b1c-40fd-a6c1-611d101dbf93-catalog-content\") pod \"af51479a-2b1c-40fd-a6c1-611d101dbf93\" (UID: \"af51479a-2b1c-40fd-a6c1-611d101dbf93\") " Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.495025 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af51479a-2b1c-40fd-a6c1-611d101dbf93-utilities" (OuterVolumeSpecName: "utilities") pod "af51479a-2b1c-40fd-a6c1-611d101dbf93" (UID: "af51479a-2b1c-40fd-a6c1-611d101dbf93"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.504202 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af51479a-2b1c-40fd-a6c1-611d101dbf93-kube-api-access-tvrsd" (OuterVolumeSpecName: "kube-api-access-tvrsd") pod "af51479a-2b1c-40fd-a6c1-611d101dbf93" (UID: "af51479a-2b1c-40fd-a6c1-611d101dbf93"). InnerVolumeSpecName "kube-api-access-tvrsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.569620 4740 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af51479a-2b1c-40fd-a6c1-611d101dbf93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af51479a-2b1c-40fd-a6c1-611d101dbf93" (UID: "af51479a-2b1c-40fd-a6c1-611d101dbf93"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.596635 4740 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvrsd\" (UniqueName: \"kubernetes.io/projected/af51479a-2b1c-40fd-a6c1-611d101dbf93-kube-api-access-tvrsd\") on node \"crc\" DevicePath \"\"" Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.596669 4740 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af51479a-2b1c-40fd-a6c1-611d101dbf93-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.596680 4740 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af51479a-2b1c-40fd-a6c1-611d101dbf93-utilities\") on node \"crc\" DevicePath \"\"" Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.759153 4740 generic.go:334] "Generic (PLEG): container finished" podID="af51479a-2b1c-40fd-a6c1-611d101dbf93" containerID="93c0523a12321523dfb19f5a480b86f11a10a1f028e2c96bbd84187dc764cd5d" exitCode=0 Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.759241 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtmw2" event={"ID":"af51479a-2b1c-40fd-a6c1-611d101dbf93","Type":"ContainerDied","Data":"93c0523a12321523dfb19f5a480b86f11a10a1f028e2c96bbd84187dc764cd5d"} Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.759298 4740 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wtmw2" event={"ID":"af51479a-2b1c-40fd-a6c1-611d101dbf93","Type":"ContainerDied","Data":"03693efad99b14223f9f45271b27958e39ace6d9d4c41c243a9f16ba6033123d"} Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.759342 4740 scope.go:117] "RemoveContainer" containerID="93c0523a12321523dfb19f5a480b86f11a10a1f028e2c96bbd84187dc764cd5d" Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.759616 4740 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wtmw2" Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.788877 4740 scope.go:117] "RemoveContainer" containerID="cf79a3bde3eeceee479ac9bbd701f04b56be3f99da128c36e7fe567b0bacd73d" Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.810745 4740 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wtmw2"] Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.819887 4740 scope.go:117] "RemoveContainer" containerID="1807d2982b7bdf890afca2b42d82f6f1b95ea4bdeba39dcb4b4ba80928bfb1ae" Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.827478 4740 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wtmw2"] Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.883223 4740 scope.go:117] "RemoveContainer" containerID="93c0523a12321523dfb19f5a480b86f11a10a1f028e2c96bbd84187dc764cd5d" Jan 05 15:27:21 crc kubenswrapper[4740]: E0105 15:27:21.884364 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93c0523a12321523dfb19f5a480b86f11a10a1f028e2c96bbd84187dc764cd5d\": container with ID starting with 93c0523a12321523dfb19f5a480b86f11a10a1f028e2c96bbd84187dc764cd5d not found: ID does not exist" containerID="93c0523a12321523dfb19f5a480b86f11a10a1f028e2c96bbd84187dc764cd5d" Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.884491 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93c0523a12321523dfb19f5a480b86f11a10a1f028e2c96bbd84187dc764cd5d"} err="failed to get container status \"93c0523a12321523dfb19f5a480b86f11a10a1f028e2c96bbd84187dc764cd5d\": rpc error: code = NotFound desc = could not find container \"93c0523a12321523dfb19f5a480b86f11a10a1f028e2c96bbd84187dc764cd5d\": container with ID starting with 93c0523a12321523dfb19f5a480b86f11a10a1f028e2c96bbd84187dc764cd5d not found: ID does not exist" Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.884583 4740 scope.go:117] "RemoveContainer" containerID="cf79a3bde3eeceee479ac9bbd701f04b56be3f99da128c36e7fe567b0bacd73d" Jan 05 15:27:21 crc kubenswrapper[4740]: E0105 15:27:21.885091 4740 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf79a3bde3eeceee479ac9bbd701f04b56be3f99da128c36e7fe567b0bacd73d\": container with ID starting with cf79a3bde3eeceee479ac9bbd701f04b56be3f99da128c36e7fe567b0bacd73d not found: ID does not exist" containerID="cf79a3bde3eeceee479ac9bbd701f04b56be3f99da128c36e7fe567b0bacd73d" Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.885135 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf79a3bde3eeceee479ac9bbd701f04b56be3f99da128c36e7fe567b0bacd73d"} err="failed to get container status \"cf79a3bde3eeceee479ac9bbd701f04b56be3f99da128c36e7fe567b0bacd73d\": rpc error: code = NotFound desc = could not find container \"cf79a3bde3eeceee479ac9bbd701f04b56be3f99da128c36e7fe567b0bacd73d\": container with ID starting with cf79a3bde3eeceee479ac9bbd701f04b56be3f99da128c36e7fe567b0bacd73d not found: ID does not exist" Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.885155 4740 scope.go:117] "RemoveContainer" containerID="1807d2982b7bdf890afca2b42d82f6f1b95ea4bdeba39dcb4b4ba80928bfb1ae" Jan 05 15:27:21 crc kubenswrapper[4740]: E0105 15:27:21.885612 4740 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1807d2982b7bdf890afca2b42d82f6f1b95ea4bdeba39dcb4b4ba80928bfb1ae\": container with ID starting with 1807d2982b7bdf890afca2b42d82f6f1b95ea4bdeba39dcb4b4ba80928bfb1ae not found: ID does not exist" containerID="1807d2982b7bdf890afca2b42d82f6f1b95ea4bdeba39dcb4b4ba80928bfb1ae" Jan 05 15:27:21 crc kubenswrapper[4740]: I0105 15:27:21.885725 4740 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1807d2982b7bdf890afca2b42d82f6f1b95ea4bdeba39dcb4b4ba80928bfb1ae"} err="failed to get container status \"1807d2982b7bdf890afca2b42d82f6f1b95ea4bdeba39dcb4b4ba80928bfb1ae\": rpc error: code = NotFound desc = could not find container \"1807d2982b7bdf890afca2b42d82f6f1b95ea4bdeba39dcb4b4ba80928bfb1ae\": container with ID starting with 1807d2982b7bdf890afca2b42d82f6f1b95ea4bdeba39dcb4b4ba80928bfb1ae not found: ID does not exist" Jan 05 15:27:22 crc kubenswrapper[4740]: I0105 15:27:22.994617 4740 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af51479a-2b1c-40fd-a6c1-611d101dbf93" path="/var/lib/kubelet/pods/af51479a-2b1c-40fd-a6c1-611d101dbf93/volumes" Jan 05 15:28:31 crc kubenswrapper[4740]: I0105 15:28:31.916675 4740 patch_prober.go:28] interesting pod/machine-config-daemon-xf724 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 05 15:28:31 crc kubenswrapper[4740]: I0105 15:28:31.918375 4740 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xf724" podUID="7737db78-0989-433f-968a-7e5b441b7537" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"